An instructor from the Office of Enterprise Technology Services (ETS) released a recorded lesson for software developers that emphasized building privacy and security into applications from the start, not as an afterthought. The lecture combined practical demonstrations (SQL injection, cross-site scripting, password hashing) with privacy guidance and legal context.
The instructor opened by telling trainees that developers are data custodians and that “with great power kind of comes the great responsibility”. He urged developers to plan privacy at the design phase, describing privacy-by-design and the CIA triad of confidentiality, integrity, and availability, the tradeoffs that should shape early architecture decisions.
Why it matters: the lesson tied everyday development choices to real harms. Using a simple SQL example, the instructor showed how unsanitized input can make a query's condition always true and allow unauthorized access, and demonstrated how treating user input strictly as data, or escaping it, prevents SQL injection and XSS attacks. He stressed never storing plaintext passwords: “I will not trust the user's input. I will hash all the passwords”, and advised strong password rules (at least eight characters, mixed case, numbers and symbols) and one-way hashing for storage.
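The two demonstrated practices can be sketched together in a short example. This is an illustrative reconstruction, not the instructor's actual demo code: the table layout, iteration count, and sample credentials are assumptions, and a production system would typically use a dedicated password-hashing library.

```python
import hashlib
import hmac
import os
import sqlite3

# In-memory demo database with a single users table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, pw_hash BLOB, salt BLOB)")

def hash_password(password: str, salt: bytes) -> bytes:
    # One-way, deliberately slow hash (PBKDF2-HMAC-SHA256);
    # the plaintext password is never stored anywhere.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    conn.execute(
        "INSERT INTO users VALUES (?, ?, ?)",  # placeholders, never string concatenation
        (username, hash_password(password, salt), salt),
    )

def login(username: str, password: str) -> bool:
    # Parameterized query: the driver treats the input strictly as data,
    # so a payload such as "' OR '1'='1" cannot rewrite the query logic.
    row = conn.execute(
        "SELECT pw_hash, salt FROM users WHERE username = ?", (username,)
    ).fetchone()
    if row is None:
        return False
    # Constant-time comparison of the stored and recomputed hashes.
    return hmac.compare_digest(row[0], hash_password(password, row[1]))

register("alice", "S3cure!Pw9")
print(login("alice", "S3cure!Pw9"))       # True: correct password
print(login("' OR '1'='1", "anything"))   # False: injection attempt is just a string
```

The key design point matches the lesson: user input never becomes part of the query text, and passwords are compared only through their one-way hashes.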
Technical hygiene and defaults: the session recommended TLS/HTTPS (at least TLS 1.2) to protect data in transit, timely patching of third‑party dependencies, network segmentation for sensitive data, and lifecycle practices — delete or anonymize data when it is no longer needed. The instructor recommended privacy-protective defaults (opt-in tracking) and techniques such as aggregation and data fuzzing to reduce re-identification risk for small samples.
Legal and reputational context: the lesson framed privacy work against high-profile incidents and laws. The instructor cited the Equifax breach (about 147 million people affected) and Ashley Madison as examples of how breaches cause extensive financial and reputational harms. He summarized major legal frameworks developers should be aware of, including the EU General Data Protection Regulation (GDPR), California’s CCPA/CPRA, the Children’s Online Privacy Protection Act (COPPA) for users under 13, and FERPA for education records, and warned of penalties for noncompliance. The lesson also pointed learners to interactive demos and to security guidance such as the OWASP Top 10 for common web vulnerabilities.
Takeaways and resources: the instructor closed with a developer pledge: validate and sanitize inputs, hash passwords, encrypt data in transit and at rest when needed, collect only necessary information, and make privacy notices clear for users. He left links to hands-on demos, interactive resources, and a short quiz for practice, and encouraged students to discuss ethical tradeoffs in small groups.
Next steps: the instructor invited participants to use the links on the last slide to practice the demos and submit projects; the slide deck and interactives are provided as the recommended follow-up.