Microsoft will launch Copilot for Security next month, bringing generative AI chatbot technology to the field of cybersecurity. Designed specifically for cybersecurity professionals, Copilot for Security aims to help them defend more effectively against a range of threats. Unlike Copilot for Microsoft 365, which is billed monthly per user, this version adopts a more flexible pay-as-you-go model: after its official launch on April 1st, Microsoft will charge a standard rate of $4 per hour based on each enterprise's actual usage.
Copilot for Security combines OpenAI's GPT-4 with Microsoft's own security-specific models to deliver the latest security-event information and threat summaries to security personnel through a chatbot interface. Since testing began last year, the chatbot has had access to real-time threat information and the 78 trillion threat-intelligence signals Microsoft collects daily.
Copilot for Security also offers collaboration features, letting security teams share information and communicate through a pinboard. It can additionally summarize security incidents for reporting, giving users a clearer picture of their security posture. As with other AI chatbots, users can type queries in natural language, upload files for analysis, and even have Copilot for Security analyze code. All activity is saved in a history log for later auditing and tracing.
The pay-as-you-go pricing model is intended to meet the AI cybersecurity needs of enterprises of all sizes. Microsoft stated: "We will adopt a simple pricing model that applies to both the standalone Copilot experience and the embedded experiences across the Microsoft security product portfolio. The consumption model means enterprises can start small, then experiment and learn quickly, without incurring upfront fixed costs based on devices or users."
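To illustrate how consumption-based billing differs from per-user licensing, here is a minimal cost-estimation sketch. The $4-per-hour rate comes from the article; the provisioned-unit counts and usage hours below are hypothetical examples for illustration, not Microsoft figures.

```python
# Illustrative pay-as-you-go cost estimate. The $4/hour rate is from the
# article; unit counts and hours are hypothetical, not Microsoft numbers.

HOURLY_RATE_USD = 4.0

def monthly_cost(units: int, hours_per_day: float, days: int = 30) -> float:
    """Estimate monthly spend for `units` provisioned compute units
    running `hours_per_day` hours a day over `days` days."""
    return units * hours_per_day * days * HOURLY_RATE_USD

# e.g. 3 units running 8 hours a day for a 30-day month:
print(monthly_cost(3, 8))  # 3 * 8 * 30 * 4 = 2880.0
```

Under this model, spend scales directly with usage, so a small team trialing the product for a few hours a day pays far less than a fixed per-seat license would cost.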
Microsoft's push to bring AI into cybersecurity comes as the company itself has been targeted by Russian state-sponsored attacks. A hacker group known as Nobelium previously infiltrated the email accounts of Microsoft executives, stole some source code, and gained access to the company's source code repositories and internal systems.
In recent years, Microsoft has made significant efforts to harden its software in response to Azure cloud security incidents, and the launch of Copilot for Security is one of the key measures in that effort.