Monday, November 11, 2024

catastrophic unlearning...

Unlearning in AI is quite a tricky conundrum. 

we really ought to do it, because 

1/ we might be asked by a patient to remove their medical record from the training data because they didn't consent, or because we breached privacy in accessing it...

2/ we might be informed that some datum was an adversary's input, designed to drift our model away from the truth,

3/ it might be a way to get a less biased model than simply adding more representative data (shifting the distribution of training data towards a better sample could be done either way).

There may be other reasons.

Technically, the problem is that the easiest way to do unlearning is to retrain from scratch, omitting the offending inputs. That may not be possible, as we may no longer have all the inputs.
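
Something like this toy sketch, assuming we still hold the full training set (often we don't), with a throwaway scikit-learn model standing in for the real thing:

    # "exact" unlearning: retrain from scratch on everything except
    # the records to forget; forget_ids is a hypothetical placeholder
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def retrain_without(X, y, forget_ids):
        """Retrain on all rows except those indexed by forget_ids."""
        keep = np.setdiff1d(np.arange(len(X)), list(forget_ids))
        model = LogisticRegression(max_iter=1000)
        model.fit(X[keep], y[keep])
        return model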

One approach some people propose is to apply differential privacy to determine whether one could remove the effect of having been trained on a particular datum, without removing that training item. Naively, this would involve further training with an inverse of that datum (in some sense). The problem is that this doesn't actually remove the internal weights, which may be complex convolutions of that datum with previous and subsequent training data - and hence later training might still reveal that the model "knew" about the forbidden input.
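
Very roughly, the naive "inverse" step might look like this - a hypothetical PyTorch-flavoured sketch of gradient ascent on the loss of the forgotten datum (model, loss_fn and the forget batch are all placeholders, and note the caveat above: this nudges the weights, it doesn't excise the memory):

    import torch

    def naive_unlearn_step(model, loss_fn, x_forget, y_forget, lr=1e-3):
        """One gradient ASCENT step on the forgotten datum's loss."""
        model.zero_grad()
        loss = loss_fn(model(x_forget), y_forget)
        loss.backward()
        with torch.no_grad():
            for p in model.parameters():
                if p.grad is not None:
                    p += lr * p.grad  # ascend: "un-fit" this datum
        return loss.item()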

But there's another problem: there's also the value of the particular data to the model in terms of its output - this is kind of like a reverse of the differential privacy argument. Two examples:

a/ rare accident video recording (or even telemetry) data for training self-driving cars

b/ DNA data from individuals with (say) very rare immunity to some specific medical condition (or indeed, a very rare bad reaction to a treatment/vaccine)

These are exactly the sorts of records you want, but they might be precisely the kinds of things individuals want removed (or that adversaries submit to really mess with the robot cars and doctors).
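
One crude way to put a number on that value is a leave-one-out comparison - a hypothetical sketch where train() and evaluate() stand in for whatever your pipeline actually does:

    import numpy as np

    def record_value(X, y, X_test, y_test, i, train, evaluate):
        """Held-out score with record i, minus the score without it."""
        full = evaluate(train(X, y), X_test, y_test)
        keep = np.delete(np.arange(len(X)), i)
        ablated = evaluate(train(X[keep], y[keep]), X_test, y_test)
        return full - ablated  # big drop => rare, valuable record

A rare-accident video or a one-in-a-million genome is exactly the record for which this difference is large.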

Perhaps this might make some of the Big AI Bros think about what they should be paying people for their content too.
