How we use AI at Capacities
AI is everywhere now. The recent innovations around large language models are a huge breakthrough that will fundamentally change how we work. It might not happen overnight, but it will transform many areas of our lives and work.
At Capacities, we constantly try to adapt our processes to benefit from these new, exciting possibilities. We don’t want to hop on every hype, but at the same time we want to use AI as effectively as possible to do better work in less time. Here’s a collection of examples of how we use AI as a team at Capacities today.
With the introduction of copilots and intelligent coding environments, our job as developers has fundamentally changed. We use auto-complete to code faster and write short prompts to get a first draft of a few lines of code. We use the AI chat in our IDE to find bugs and flaws in our code.
At the moment, we’re also experimenting with code composers. They let you specify what you want to do in natural language, and the AI then manipulates and extends your code to achieve that goal. They don’t work well in many cases, but they sometimes blow our minds. We found them very useful for writing encapsulated units of code or small pieces of functionality in frameworks or libraries we’re not very familiar with. Here, they can really reduce the time needed from multiple hours to a few minutes. That is just incredible.
Alongside the AI in our IDEs, we constantly use the AI assistant in Capacities for development. We use it to understand new technologies, libraries, or frameworks, draft prototypes for new implementations, or iterate on our concepts. It’s truly remarkable how valuable this process is. Sometimes you’re not experienced in a field, and you can benefit tremendously from the “common wisdom” on the internet by asking the AI. It has helped us find new libraries and convinced us to use one approach over another by outlining the pros and cons.
We also rely heavily on the AI assistant for writing. We don’t use it to generate articles or posts for us, but we use it to critique our chain of thought, find flaws in our arguments, improve our grammar and style of writing, and much more. The AI is like a writing tutor if prompted with the right questions. You don’t have to accept everything, but you can learn a lot by getting an opinion on your writing.
Beyond that, we of course constantly use the AI features integrated into Capacities. Lately, we realized that AI tagging and property autofill are absolute game changers. They remove a lot of mental load by helping you connect new content to what you already have.
There are many more ways of using AI, and it will surely revolutionize many more workflows. We have exciting ideas for AI in Capacities and for using LLMs to improve the overall user experience, like a chat interface for asking questions about Capacities, and much more!
The possibilities are endless, and one big challenge is understanding when AI is useful and, equally important, when it’s not. We had to learn how to Google, and now we need to learn how to use LLMs. That journey is fascinating!