

In recent years, there has been growing interest in the effects of data
poisoning attacks on data-driven control methods. Poisoning attacks are well
known to the Machine Learning community; however, the methods studied there
rely on assumptions, such as cross-sample independence, that in general do not
hold for linear dynamical systems. Consequently, these systems require
different attack and detection methods than those developed for supervised
learning problems in the i.i.d. setting. Since most data-driven control
algorithms rely on the least-squares estimator, we study how poisoning impacts
the least-squares estimate through the lens of statistical testing, and ask in
what way data poisoning attacks can be detected. We establish under which
conditions the set of models compatible with the data includes the true model
of the system, and we analyze different poisoning strategies for the attacker.
On the basis of these arguments, we propose a stealthy data poisoning attack on
the least-squares estimator that can escape classical statistical tests, and
conclude by showing the effectiveness of the proposed attack.
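The closing claim — a poisoning attack that biases the least-squares estimate while escaping classical statistical tests — can be illustrated with a minimal numerical sketch. Everything below (the 2-state system `A_true`, the noise level, and the state-correlated perturbation `c * X0`) is an illustrative assumption, not the paper's actual construction:

```python
import numpy as np

# Identify x_{t+1} = A x_t + w_t from a trajectory via least squares.
# A_true, the noise level, and the horizon T are illustrative choices.
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.2],
                   [0.0, 0.8]])
T = 500
X = np.zeros((2, T + 1))
for t in range(T):
    X[:, t + 1] = A_true @ X[:, t] + 0.05 * rng.standard_normal(2)
X0, X1 = X[:, :-1], X[:, 1:]  # regressors x_t and targets x_{t+1}

def ls_estimate(X0, X1):
    # \hat A = X1 X0^T (X0 X0^T)^{-1}
    return X1 @ X0.T @ np.linalg.inv(X0 @ X0.T)

A_hat = ls_estimate(X0, X1)

# A perturbation correlated with the regressors, Delta = c * x_t added
# to each target, is absorbed entirely by the estimator: \hat A shifts
# by exactly c*I, while the residuals X1 - \hat A X0 are unchanged, so
# residual-based statistical tests see nothing.
c = 0.05
A_pois = ls_estimate(X0, X1 + c * X0)

res_clean = X1 - A_hat @ X0
res_pois = (X1 + c * X0) - A_pois @ X0

print(np.linalg.norm(A_pois - A_hat))    # ~ c * sqrt(2): biased estimate
print(np.allclose(res_clean, res_pois))  # True: residuals unchanged
```

Because the perturbation is aligned with the regressors, the injected bias lands in the parameter estimate rather than in the residuals; a detector that only checks residual statistics cannot flag it, which is the flavor of stealthiness the abstract describes.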

ID: 130193
Latest Change: May 16, 2023, 7:32 a.m.