DeepSeek, a newcomer in the tech world, has generated considerable buzz for benchmark results that rival or beat established models like OpenAI’s o1 reasoning model. Across math, science, and coding benchmarks, DeepSeek performs consistently well, all while reportedly being developed at a fraction of the cost of its counterparts.
Many users have hesitated to fully embrace generative AI because of security and privacy concerns. Repeated cases of user data being used for AI training without explicit consent have eroded trust between users and the technology.
Now, an alarming new disclosure indicates that DeepSeek’s website may be transmitting user login details to China Mobile, a Chinese state-owned telecommunications company barred from conducting business in the U.S. (as reported by The Associated Press). That ban was imposed on China Mobile over its suspected ties to the Chinese military.
According to researchers at Feroot Security, code on the DeepSeek website surreptitiously captures a user’s login details during account creation and sign-in. Although the Chinese AI company’s privacy policy acknowledges that it stores user data, the report reveals more extensive connections between DeepSeek and China than previously assumed.
According to Feroot Security CEO Ivan Tsarynny:
It’s astonishing how we’re obliviously letting China monitor Americans, and yet we’re not taking any action. It’s hard to accept that such a situation could be coincidental. There are numerous oddities surrounding this. Don’t you recall the old saying ‘Where there’s smoke, there’s fire’? In this case, there seems to be a considerable amount of smoke indicating potential danger.
Although we can’t independently verify Feroot Security’s findings, The Associated Press shared the report with a second team of cybersecurity specialists, who confirmed the presence of the code in question on DeepSeek’s site.
Even after thorough examination and testing, neither group of cybersecurity experts could definitively determine whether the code actually transmitted user data to the Chinese government during logins performed in North America. That said, the findings don’t rule out the possibility that user data was shared with the Chinese telecommunications company.
In an interview with The Associated Press, Stewart Baker, a legal expert and advisor who previously worked for the Department of Homeland Security and the National Security Agency, shared his thoughts.
“Similar to TikTok, DeepSeek raises the same apprehensions. However, the data involved here may be even more consequential for national security and personal privacy than what people typically share on TikTok.”
In related news, OpenAI and Microsoft have claimed that DeepSeek used their copyrighted material to train its R1 and V3 models. Separately, another report estimates that DeepSeek spent roughly $1.6 billion, rather than the previously cited $6 million, developing its AI systems.
The news comes as DeepSeek enjoys widespread acclaim and adoption, having overtaken ChatGPT as the most downloaded free AI app on Apple’s U.S. App Store.
2025-02-06 00:09