Rust-based AI terminal gets a superpower with this new feature!
Warp is a Rust-based, cross-platform terminal app that tries to "reimagine with AI and collaborative tools for better productivity". Its adoption has been growing consistently, with many users finding its AI-powered features under the Warp AI umbrella useful.
Coming off the introduction of Warp for ARM, the developers have introduced a new feature to Warp AI called "Agent Mode", which, they claim, embeds an LLM directly into the terminal for "multi-step workflows".
Let's check it out.
A new way to interact with AI on Warp, Agent Mode understands plain English text (including commands), letting it suggest code, walk you through outputs, correct itself, learn to work with any service that hosts public documentation or a help page, and more.
Thanks to its natural language prowess, users can simply ask it to do things, similar to having a conversation with a colleague or a subject-matter expert. It has tight integration with the CLI, enabling it to interface with any service without any extra configuration overhead.
The developers add that:
If the service has a CLI, an API, or publicly available docs, you can use Agent Mode for the task. Agent Mode has inherent knowledge of most public CLIs, and you can easily teach it how to use internal CLIs by asking it to read their help content.
They also stress that auto-detection of natural language happens locally; data is only sent to their servers when you proceed with a request to process the entered text with Warp AI. You can also disable auto-detection if you'd like, but it's enabled by default.
They clarified that part with an example:
If Warp AI needs the name of your git branch, it will ask to run git branch and read the output. You will be able to approve or reject the command. We've designed Agent Mode so that you know exactly what information is leaving your machine.
Of course, it's no secret that Warp AI is powered by OpenAI, so naturally all the input is sent to them, with Warp assuring that OpenAI doesn't train its models on such data. Users who aren't comfortable with that can switch to a different LLM provider by opting for Warp's enterprise plan.
If you were wondering how Agent Mode could help, here's an example given by the developers.
They illustrate how you can deal with the "port 3000 already taken" error.
After attaching context to a query, users can type something like "fix it", and Agent Mode will start working on providing solutions 🤯
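For context, the error in that example comes from trying to bind a TCP port that another process is already holding. Here is a minimal, self-contained Python sketch reproducing it; the exact port is an illustrative assumption (an ephemeral port is used instead of hardcoding 3000, so the snippet doesn't clash with anything actually running on your machine):

```python
import errno
import socket

# First socket stands in for the dev server that is already running.
# Binding to port 0 asks the OS for a free ephemeral port.
holder = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
holder.bind(("127.0.0.1", 0))
port = holder.getsockname()[1]
holder.listen()

# A second bind to the same port fails with EADDRINUSE --
# the same class of error as "port 3000 already taken".
caught = None
second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))
except OSError as e:
    caught = e.errno

second.close()
holder.close()

print(caught == errno.EADDRINUSE)  # True: the port was already in use
```

The usual manual fix is to find the process holding the port and stop it (or run your app on another port), which is the kind of multi-step diagnosis the developers show Agent Mode walking through for you.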
Here's a brief demo of its AI agent feature that we tried ourselves.
If this new feature of Warp interests you, then you can head to its official website, where you will find packages for Linux and macOS, with the Windows app coming soon.
Users of every plan tier will be able to access this, with the free plan including up to 40 AI requests per month, and users of the paid plans having more.
For existing users, the new Agent Mode now replaces the first-gen Warp AI chat panel. The "type # to ask Warp AI" feature is retained for the short term, with plans to retire it as Agent Mode matures.
If you would like to learn more, the official announcement blog post and the documentation have further information.
💬 I know most of you do not like proprietary apps like this. But if you give Warp a try, do let me know in the comments below.