
According to Stack Overflow's 2024 survey, 76% of developers are using or planning to use AI tools—they're now simply part of the job. They help with mundane tasks but can be infuriating when they confidently generate nonsense. While YouTubers build "billion-dollar startups" with ChatGPT prompts and AI agents take over the world every other week, real teams are still figuring out how to use these tools effectively. Today, mastering AI assistance is as fundamental a skill as coding or system design—we need to adapt, and fast.
The problem is that these tools are still essentially at an R&D stage: they change constantly, copy each other, and, to their credit, keep solving previously unsolved problems. What they all lack is clear usage guidance; even Copilot, the most established of them, offers no definitive tutorials or best practices. The solution? We'll do what developers do best: organize, and build a framework ourselves.
...and also "Quantum leap", "Paradigm shift", "Agents are coming", you name it. While these tools are indeed transforming our workflow, the real change is more practical: developers now operate like team leads, managing AI assistants instead of writing all the code directly. The core skills have shifted to designing, planning, describing, and reviewing.
The main UX concepts these tools introduce are:

- Suggestions: inline completions as you type
- Chat: a conversation with the model, with files from your codebase attached as context
- Composer: prompt-driven editing across multiple files

Getting familiar with suggestions isn't complicated; AI assistants started with them, and they were what first caught our attention. Chats are also straightforward: add files to the context, iterate, apply, and validate the results.
Composer-type tools present more challenges to master, requiring a learning curve and some non-obvious approaches. Currently, the Cursor editor offers the most approachable "composer" tool, while Copilot follows closely with "Copilot Edits," having recently introduced agent-based workflows.
To become proficient with composers, you need to understand three key concepts:

1. Design documents that define what to build
2. Rules that define how to build it
3. Context that grounds each request in your codebase
Let's examine each of these.
As team leads rather than just developers, we should begin any new project or major feature by creating a Design Document or a clear Product Requirements Document. This practice develops strong engineering and product thinking while saving substantial implementation time. The best part is that these documents can be:

- drafted together with a reasoning model
- stored and versioned in the repository alongside the code
- fed straight to the composer as context during implementation
To create these documents, first gather requirements from humans, then consult a reasoning model in Chat. Both Copilot and Cursor have built-in reasoning models suited for this task: OpenAI's o1 and o3-mini are available by default, while Cursor's Chat supports DeepSeek-R1 (though not yet in its Composer) – all excellent tools for this purpose.
A good practice is storing design documents at your repo's top level (we'll use a `requirements` folder), organized by features, with a `ProjectOverview.md` in the root. Here's an example structure for a Twitter web app's requirements:
```
requirements/
├── ProjectOverview.md      # Core product description
└── Features/
    ├── Authentication.md   # User registration
    ├── Tweet.md            # Tweet CRUD
    ├── UserProfile.md      # Profile management
    ├── Engagement.md       # Likes, retweets
    ├── Infrastructure.md   # Storage, caching, etc.
    └── ...
```
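Each feature document can stay short and structured. As an illustration (my sketch, not a template from any tool), `Tweet.md` might look like this:

```markdown
# Tweet

## Summary
Users can create, read, and delete tweets of up to 280 characters.

## Requirements
- Authenticated users can post tweets from the home timeline
- Tweets are plain text; media attachments are out of scope for v1
- Authors can delete their own tweets; deletion is permanent

## Open questions
- Should editing be allowed, and for how long after posting?
```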
If everything is properly set up, adding a design document for a new feature is as simple as writing a prompt like the one below (shown here for a hypothetical direct-messages feature):
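```
Create requirements/Features/DirectMessages.md for a new direct-messages feature:
users can send private text messages to each other. Follow the structure of the
existing documents in requirements/Features and use @ProjectOverview.md for
product context.
```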
Storing instructions in your codebase offers clear advantages: version control, easy maintenance, and the standard PR workflow. However, non-technical team members like Product Owners, managers, and UX designers may need access without using git. Two practical solutions:

1. Store everything in Notion, publish the instruction pages, and inject them as documentation using the `@Docs` shortcut
2. Set up a sync that exports those pages as `.md` files and stores them in the repository (sketched below)
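A minimal sketch of that sync, assuming the official `@notionhq/client` package plus the community `notion-to-md` converter (the page IDs and paths here are illustrative):

```typescript
import { writeFileSync } from "node:fs";
import { Client } from "@notionhq/client";
import { NotionToMarkdown } from "notion-to-md";

// Hypothetical mapping of published Notion pages to files in the repo
const pages = [
  { id: "<notion-page-id>", path: "requirements/Features/Tweet.md" },
];

const notion = new Client({ auth: process.env.NOTION_TOKEN });
const n2m = new NotionToMarkdown({ notionClient: notion });

for (const page of pages) {
  // Convert the page's Notion blocks to markdown
  const blocks = await n2m.pageToMarkdown(page.id);
  const markdown = n2m.toMarkdownString(blocks);
  // notion-to-md v3 returns an object keyed by parent; v2 returned a plain string
  writeFileSync(page.path, markdown.parent);
}
```

Run it on a schedule in CI so the repository copy never drifts far from Notion.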
Once your instructions are accessible in your editor, switch to the composer and start implementing. This leads us to organizing the Rules.
Currently, only Cursor supports "rules": direct implementation instructions for specific files and folders. This feature will likely spread to other editors, including VS Code Copilot, which for now only offers "prompt files" that can't be directly attached to the codebase.
Cursor's rules are more comprehensive: imagine `CONTRIBUTING.md` combined with linter rules and enhanced by LLM capabilities. These rules are product-agnostic, shareable, and effectively transfer knowledge, best practices, and implementation details across teams and library users.
Rules can be created via the command palette and are stored in your project's `.cursor/rules` folder with a `.mdc` extension. This format enables advanced features like `@mentioning` specific files in your codebase. It's highly recommended to commit these rules to your repository and collaborate on improving them. The workflow for using rules boils down to this:

1. Create a rule via the command palette
2. Scope it to the relevant files or folders with glob patterns
3. Commit it so the whole team shares the same conventions
4. Let the composer pick up the matching rules automatically as it edits those files
Many libraries urgently need AI rules. From a frontend developer's perspective, I would benefit from having them for TanStack Query, React Spring, Firebase, and many, many more. These rules would save significant time and help prevent the common mistakes developers make when learning new technologies.
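To make this concrete, here's a sketch of what a hypothetical `tanstack-query.mdc` rule could look like (my illustration, not an official rule file):

```
---
description: TanStack Query conventions for this codebase
globs: ["src/**/*.ts", "src/**/*.tsx"]
---

- Fetch server state with useQuery/useMutation; never mirror it into useState
- Define query keys as exported constants in a single module
- Invalidate the affected queries in each mutation's onSuccess callback
- Set staleTime explicitly instead of relying on the default of 0
```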
Remember to include all relevant context: the more quality data you provide, the better the results you'll get. The Cursor editor has an advantage over Copilot here, allowing several types of context:

- `@Files` and `@Folders` from your codebase
- `@Code` for specific symbols
- `@Docs` for external documentation
- `@Web` for live search results
- `@Git` for commit history and diffs
After mastering these tools, your next step is optimizing both your individual and team performance. But what's the path forward from here?
You'll always face a tradeoff between simplicity and control, between automated solutions and manual decision-making. If you're willing to dive deep and aren't afraid of dealing with bugs, performance challenges, and rough edges, consider trying Cline (or its fork Roo-Code, which has a slightly different philosophy).
Both tools are designed to give you as much transparency as possible into what's really going on under the hood:

- you bring your own API keys and see token usage and cost for every request
- each file edit and terminal command is shown as a diff and requires your explicit approval
The killer feature is that Cline can actually run and debug your application - it's real and functional, as you'll see when you try it.
If all this interests you, check out Addy Osmani's recent article, which provides an excellent introduction to these editors.
Adopting these tools isn't a simple journey, so don't expect to write "the whole project from scratch in under 5 minutes." Yet the path forward is clear.
The technology is already here, but we're missing a robust AI-integrated workflow that organizes the entire team around these new tools: not just developers, but crucially managers and designers too. AI can feel intimidating, and sharing its impact may seem uncomfortable at first (like telling your team lead that AI wrote 80% of a feature through careful configuration). However, software development will only evolve when these tools become integral to team workflows. The most successful transitions happen in teams that openly discuss their AI experiences, explore the tools together, and contribute the best practices they learn back to the broader development community.