Apple has reportedly restricted internal use of ChatGPT and other AI tools to prevent possible data leaks. Data shared with ChatGPT can be retained to help train the model, and OpenAI's developers have access to it, which would of course include any sensitive information an employee shares. Apple's ChatGPT restrictions come as generative AI sweeps through the tech world at a rapid pace, with the likes of Roblox's in-development AI creation tools joining the wave.
As reported by The Wall Street Journal (via Engadget), Apple has restricted employees' use of OpenAI's language model, though it's unclear exactly how. The Economist Korea previously reported that Samsung has limited employees' ChatGPT prompts to one kilobyte, or just over a thousand characters, of text.
The Samsung ChatGPT debacle apparently included one employee asking ChatGPT to review database source code for errors, while another uploaded a recorded meeting for the AI to write up the minutes. OpenAI's developers could theoretically see all of this data, including proprietary source code, which is obviously not a good thing for any company.
Apple is reportedly working on its own language model, though we don't know much about it at the moment. The latest news from the Apple world is the arrival of Final Cut Pro and Logic Pro on iPad, which you can check out in the video below.
For more beyond Apple’s ChatGPT restrictions, check out our guide to the best iPads and the best gaming phones on the market today.