- Australia’s Department of Home Affairs has banned the use of DeepSeek
- India’s finance ministry has also warned against its use
- Concerns have arisen over the privacy and safety of AI models
The new DeepSeek AI chatbot has been making headlines, and even briefly became the world’s most popular chatbot within 10 days of its launch – overtaking existing models like ChatGPT and Gemini.
However, new research has claimed the DeepSeek chatbot is ‘incredibly vulnerable’ to attacks, sparking national security concerns that led Australia’s Department of Home Affairs to ban the use of the model on federal government devices.
The policy, issued on February 4, 2025, determines that the use of DeepSeek products and web services ‘poses an unacceptable level of security risk to the Australian Government’. It warns that departments must manage the risks from the ‘extensive collection of data’ and the exposure of that data to ‘extrajudicial directions from a foreign government that conflict with Australian law’.
Following the trend
Australia isn’t alone in this. India’s finance ministry has also asked its employees to avoid using AI tools such as DeepSeek and ChatGPT for official purposes, citing the risk to confidential government documents and data.
Similarly, the US Navy banned the use of DeepSeek in ‘any capacity’ due to ‘potential security and ethical concerns’. Italy’s data protection authority, meanwhile, said it ordered DeepSeek to block its model in the country after the company failed to address the regulator’s privacy policy concerns, providing only information that the regulator found ‘totally insufficient’.
AI companies like OpenAI and DeepSeek collect vast amounts of data from all corners of the internet to train their chatbots, and have run into trouble with data privacy laws around the world.
Beyond that, some models have worrying privacy policies. For example, OpenAI has never asked people for consent to use their data, and individuals have no way to check what information about them has been stored.