Breaking News

Microsoft has introduced a new feature called 'Computer Use', aimed at enhancing agent capabilities by letting agents treat websites and applications as tools for task completion.
Microsoft is redefining the way users interact with their computers through its latest innovation, 'Computer Use', a new feature within Copilot Studio that lets an AI agent operate a computer the way a person would. This capability allows the assistant to navigate websites and desktop apps much like a human: clicking buttons, selecting items from drop-down menus, and entering text into fields.
Now in research preview for select users, this PC virtual assistant supports most major desktop and web applications, including Edge, Chrome, and Firefox. With this development, Microsoft brings natural language PC control to life, allowing users to simply tell their PC what to do—without needing to know complex commands or coding.
By leveraging AI for PC navigation, the tool significantly boosts productivity and efficiency, especially for repetitive or data-entry-heavy tasks. The assistant uses contextual understanding and real-time screen recognition to complete workflows, making daily computer use more intuitive and streamlined.
Charles Lamanna, a senior leader at Microsoft, says the new feature is meant to make AI agents behave like real users, able to click, choose, and type information just as a person would. "This allows agents to handle tasks even when there is no API available to connect to the system directly. If a person can use the app, the agent can too," Lamanna explained.
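Conceptually, an agent of this kind runs an observe-plan-act loop over the user interface itself, performing clicks, selections, and keystrokes instead of calling an API. The following Python sketch is purely illustrative: every function and class name here is hypothetical, and it simulates the actions as a log rather than driving a real screen; it is not Copilot Studio's actual implementation.

```python
# Hypothetical sketch of a UI-driven agent loop. The agent plans a
# sequence of on-screen actions (click / select / type) and performs
# them one by one -- no API to the target app is required.
# All names are illustrative, not Microsoft's API.

from dataclasses import dataclass

@dataclass
class UIAction:
    kind: str       # "click", "select", or "type"
    target: str     # the on-screen element the agent recognized
    text: str = ""  # text to enter, used only for "type" actions

def plan_actions(goal: str) -> list[UIAction]:
    # A real agent would combine screen recognition with a model to
    # plan these steps; here a data-entry workflow is hard-coded.
    return [
        UIAction("click", "New Record button"),
        UIAction("select", "Country drop-down"),
        UIAction("type", "Name field", text=goal),
        UIAction("click", "Save button"),
    ]

def run_agent(goal: str) -> list[str]:
    # Execute each planned action; in this sketch, "executing" just
    # records what a real agent would do to the screen.
    log = []
    for action in plan_actions(goal):
        if action.kind == "type":
            log.append(f"type '{action.text}' into {action.target}")
        else:
            log.append(f"{action.kind} {action.target}")
    return log

for step in run_agent("Ada Lovelace"):
    print(step)
```

The key property this sketch mirrors is the one Lamanna describes: because the agent acts through the same interface a person uses, it can automate applications that expose no programmatic integration at all.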
As part of Microsoft’s broader Windows AI features, this move aligns with the company's vision of deeply integrated, personalized AI assistants. Users can expect future updates that expand compatibility, refine user interaction, and support more personalized settings, eventually enabling seamless, voice-activated control across Windows devices.
Microsoft had earlier launched a feature called Actions, which enabled users to delegate routine tasks to Copilot, allowing them to concentrate on more important work. While Actions was primarily designed for individual and personal use, the newly introduced 'Computer Use' feature within Copilot Studio takes things a step further. Aimed at business and enterprise-level automation, this advanced tool enables AI agents to perform on-screen activities more comprehensively—making it better suited for automating complex workflows across professional environments.
This development represents a significant step toward a future where intelligent assistants become an everyday part of using PCs—bringing the power of AI into mainstream computing in ways that are both accessible and transformative.