AI as TTPs (Trusted Third Parties) is a recent posting by Bruce Schneier, who has impeccable security credentials.
However, I'm not convinced that the paper he is highlighting is as groundbreaking as he suggests.
The authors of the paper also have great track records, including in AI, but I think they're missing something basic: a single TCME ("Trusted Capable Model Environment"), or a group of them, can't actually do anything different from any other computation, subject to the basic privacy controls (e.g. access control and authorisation, auditing, and encryption of data at rest, in transit, and during computation - the last using FHE, TEEs, and so on).
But also:
a) visible communication in/out of the computation - i.e. information flow control
b) control over the specificity of that data (i.e. differential privacy - can you tell whether an individual record is present or not, to put it crudely)
c) secure multiparty computation and zero-knowledge systems
which the paper compares and contrasts with their new TCME notion. However, I think the dimensions they use for comparison are a bit of a stretch.
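To make point (b) concrete, here is a minimal sketch (my own illustration, not anything from the paper) of a differentially private count using the Laplace mechanism: an observer of the noisy answer can't reliably tell whether any single record is present in the data.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Count records matching predicate, with Laplace noise (sensitivity 1)."""
    true_count = sum(1 for r in records if predicate(r))
    # Laplace(0, 1/epsilon) noise, sampled as the difference of two
    # Exp(1) variates scaled by 1/epsilon.
    e1 = -math.log(1.0 - random.random())
    e2 = -math.log(1.0 - random.random())
    return true_count + (e1 - e2) / epsilon

# Two neighbouring datasets differing in exactly one record: the noisy
# answers overlap, so record presence can't be inferred with certainty.
with_record = [10, 20, 30, 40]
without_record = [10, 20, 30]
print(dp_count(with_record, lambda r: r >= 20, epsilon=0.5))
print(dp_count(without_record, lambda r: r >= 20, epsilon=0.5))
```

Smaller epsilon means more noise and stronger indistinguishability between the two neighbouring datasets - exactly the "can you tell if a record is present" property, with no AI involved.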
The main problem, I think, is that a TCME seems to be indistinguishable from any other trusted program.
Any shared secret between models (e.g. in federated or decentralised learning) works just the same for AI/ML as for any other algorithm. Perhaps the intersection of probability distributions looks a bit different from just being able to say "the richest person is A" without knowing how rich A (or B, or C) actually is - but in the end, a distribution has some moments and can be described by some number of them, more or less precisely. A distribution of distributions can be aggregated with more or less precision or uncertainty (e.g. respecting differential privacy at the widest level, or preventing set-membership inference at the finest grain), and the model itself can be protected from model inversion attacks by various schemes. But I don't see what TTP function is provided that isn't just a different mix of existing techniques for providing trust.
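The "richest person" example is the classic millionaires problem from secure multiparty computation, and even plain aggregation of private values (the federated-learning case) needs no model in the loop. A minimal sketch using additive secret sharing - my own illustration, with the field modulus and three-party setup chosen arbitrarily:

```python
import random

PRIME = 2**61 - 1  # field modulus for additive sharing

def share(value, n):
    """Split value into n additive shares that sum to value mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three parties each hold a private salary; only the sum is revealed.
salaries = [55_000, 72_000, 61_000]
all_shares = [share(s, 3) for s in salaries]
# Party i locally sums the i-th share it received from every input...
partial = [sum(col) % PRIME for col in zip(*all_shares)]
# ...and publishing the partial sums reveals only the total.
total = sum(partial) % PRIME
print(total)  # 188000
```

Each individual share is uniformly random, so no party (or coalition short of all of them) learns anything about another's input - trust comes from the protocol, not from any property of the program doing the computing.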