Bias in data can lead to unfair and discriminatory outcomes in AI systems. By actively seeking out and addressing bias, we can work toward ensuring fair treatment and nondiscrimination for all individuals and groups.

Understanding personal and sensitive attributes in data

In the context of bias in AI solutions, a sensitive feature is a characteristic or attribute that is closely associated with a protected or vulnerable group.

Sensitive features can include attributes such as race, ethnicity, gender, age, religion, sexual orientation, disability status, and socioeconomic background. These features are considered sensitive because they have historically been associated with discrimination or marginalization in various domains.
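To make this concrete, a common first step is to separate sensitive attributes from the features a model is trained on so they can be set aside for fairness audits. The sketch below illustrates this with pandas; the dataset and column names are purely hypothetical.

```python
import pandas as pd

# A toy applicant dataset; the columns here are illustrative only.
df = pd.DataFrame({
    "age": [34, 29, 51],
    "gender": ["F", "M", "F"],
    "ethnicity": ["A", "B", "A"],
    "income": [52000, 61000, 48000],
    "approved": [1, 0, 1],
})

# Attributes treated as sensitive for fairness analysis.
sensitive_features = ["age", "gender", "ethnicity"]

# Keep the sensitive columns aside for bias audits and exclude them
# from the feature set used to train the model.
sensitive = df[sensitive_features]
features = df.drop(columns=sensitive_features)

print("Sensitive columns:", list(sensitive.columns))
print("Model feature columns:", list(features.columns))
```

Note that simply dropping sensitive columns does not remove bias on its own, because other features (such as income or location) can act as proxies for them; keeping the sensitive columns available for auditing is what makes later fairness checks possible.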

Personally identifiable information (PII) refers to any information that can be used to uniquely identify an individual. It includes attributes such as full name, Social Security number, date of birth, address, phone number, email address, and financial account numbers. PII is considered sensitive because its exposure or misuse can lead to privacy breaches, identity theft, or other forms of harm.
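As a minimal illustration of handling PII before data is used for training or analysis, the sketch below redacts a few common PII patterns with regular expressions. The patterns and the sample record are assumptions for demonstration only; production systems typically rely on dedicated PII-detection tooling, since regexes miss items such as names.

```python
import re

# Simple, illustrative patterns for a few common PII formats (assumed, not exhaustive).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace matched PII substrings with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text

# Hypothetical record containing several kinds of PII.
record = "Contact Jane Doe at jane.doe@example.com or 555-123-4567, SSN 123-45-6789."
print(redact_pii(record))
```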
