Data Poisoning in Machine Learning: Why and How People Manipulate Training Data

https://towardsdatascience.com/data-poisoning-in-machine-learning-why-and-how-people-manipulate-training-data/
Data poisoning is the act of altering a machine learning model's training data to manipulate its behavior; once the model is trained on the tampered data, the effect is baked into its learned parameters. The technique can serve criminal ends, such as weakening a cybersecurity model or inducing fraudulent predictions for financial gain. Conversely, creators use data poisoning defensively to keep their intellectual property from being used to train generative AI models without permission. With tools like Nightshade or Glaze, they can make their work unlearnable or cause the resulting model to degrade, disincentivizing content theft.
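To make the core idea concrete, here is a minimal sketch of the simplest poisoning attack, label flipping, against a toy nearest-centroid classifier. All names and data here are illustrative assumptions, not from the article; real attacks target far larger models and are usually more subtle.

```python
# Toy nearest-centroid classifier to illustrate label-flipping poisoning.
# (Illustrative sketch only; names and data are made up.)

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def train(data):
    # data: list of (features, label) pairs; model is one centroid per class
    by_label = {}
    for x, y in data:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(xs) for y, xs in by_label.items()}

def predict(model, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda y: dist2(model[y], x))

# Clean training set: class 0 clusters near the origin, class 1 near (10, 10).
clean = [((0, 0), 0), ((1, 1), 0), ((10, 10), 1), ((11, 11), 1)]

# Poisoned copy: an attacker flips every label before training.
poisoned = [(x, 1 - y) for x, y in clean]

clean_model = train(clean)
bad_model = train(poisoned)

print(predict(clean_model, (0.5, 0.5)))  # class 0, as expected
print(predict(bad_model, (0.5, 0.5)))    # same point now misclassified as 1
```

The point of the sketch is that the attacker never touches the training code or the deployed model, only the data; the corrupted behavior is then frozen into the model's parameters at training time, which is why poisoning is hard to undo after the fact.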
0 points by ogg, 6 hours ago

Comments (0)
