prompt injection

Adversarial manipulation of model instructions to override intended behavior or leak sensitive data.