MFA Your Grandma
I was born in 1990 (a young whippersnapper, I know), which means I was too young for commercials asking “It’s 10pm, do you know where your child is?” but old enough to remember PSAs at the end of Saturday morning cartoons and random episodes of TV shows where Spider-man or friends had to fight the personification of drug addiction.
Those were still the days of Scruff McGruff sending you counter-crime comic books if you wrote in, concerns that drug dealers were giving out free drugs at elementary schools, fear of adults putting edibles in Halloween buckets, and paranoia of kidnappers in vans with candy. Not to make too much light of serious fears, but the age in which the internet emerged to dominance was not without its security concerns.
To help deal with stranger danger, my parents developed a challenge and response for the family. The idea was that if someone came to pick us up because “your parents said to come get you” (in other words, “to kidnap you”) we were only to believe them if they knew the password. The password was memorable, silly, and almost never used except when our parents would test us. Trusted family friends had it, as did—I assume—our grandparents and godparents.
The thing that my dad understood then, and that I rely on now professionally, is that control over our own identity and trust is powerful. Why do you use a call-sign when playing with a walkie-talkie as a kid? Because you’re keeping control over your identity so that people can’t abuse it to build trust they don’t deserve, and it makes any information they overhear less valuable.
This applies to secure messaging and security operations, even in the emerging age of AI: Security requires trust requires identity.
Think about a squad of soldiers securing a hill. To have “security,” you need control over who has access to that hill and what they are allowed to do when they get there. That security relies on both trust and a boundary, in this case a literal point on the ground that those not trusted must not cross. In computing, the boundary is usually roles, functions, and information someone may not access. To enforce trust, you need to know someone’s identity. And identity must be authenticated.
In warfare, we’ve learned quickly that identity cannot be authenticated with a single, falsifiable piece of information. “Oh you’re wearing the same uniform as me! You must be on my side” breaks down rapidly.
Nations have solved that problem in a variety of ways over the years. One example is the emergence of the challenge coin (https://en.wikipedia.org/wiki/Challenge_coin#Origins) and its legendary use by a downed WWI pilot who needed to prove he was who he said he was. Another is the classic call and response: if you want to return from “using the woodline” as a bathroom, you had better know your platoon’s password to get back through security.
In cybersecurity, identity is usually first asserted with a username or user ID. These are easy to falsify, so we add a factor of authentication, usually a password. But passwords can be guessed or stolen, so we’ve developed three types of authentication factors:
Type 1 – Something you know (e.g. password, pass phrase)
Type 2 – Something you have (e.g. a token, smart card, or code-generating device)
Type 3 – Something you are (e.g. biometrics like fingerprints)
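To make the factor types concrete, here is a minimal Python sketch of combining Type 1 (a password the user knows) with Type 2 (a TOTP-style code from a device the user has), in the spirit of RFC 6238. The stored hash, shared secret, and helper names are all illustrative assumptions, not any particular product; a real system would use salted password hashing (bcrypt/argon2) and provisioned per-user secrets.

```python
import hashlib
import hmac
import struct
import time

# Hypothetical demo values for illustration only.
STORED_PASSWORD_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()
TOTP_SECRET = b"shared-device-secret"


def check_knowledge_factor(password: str) -> bool:
    """Type 1: something you know."""
    candidate = hashlib.sha256(password.encode()).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, STORED_PASSWORD_HASH)


def totp_code(secret: bytes, timestep: int) -> str:
    """An RFC 6238-style six-digit code from a shared secret and the clock."""
    msg = struct.pack(">Q", timestep)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{value % 1_000_000:06d}"


def check_possession_factor(code: str) -> bool:
    """Type 2: something you have (the device generating the code)."""
    step = int(time.time() // 30)
    # Accept the current and previous 30-second step to tolerate clock drift.
    return any(hmac.compare_digest(code, totp_code(TOTP_SECRET, s))
               for s in (step, step - 1))


def authenticate(password: str, code: str) -> bool:
    """Multi-factor: both independent factors must pass."""
    return check_knowledge_factor(password) and check_possession_factor(code)
```

The point of the sketch is the last function: compromising one factor (a stolen password) is not enough, because the second factor comes from a different channel entirely.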
In the post-COVID, high-speed, work-from-home environment, we’ve been defaulting to trusting someone’s identity because 1) they say they are John from IT and 2) they look and sound like John from IT on the video call. So-called generative AI is challenging us to re-engage with security, trust, and identity: if you can’t trust what you see and hear, how can you be secure?
Fortunately, security principles are more durable than the technologies that implement them. Can’t trust who they appear to be (Type 3)? Can we marry their audio and video back to something they know or something they have? Can we slow down and turn to multi-factor authentication?
What’s one thing that you and your grandma both know? What’s a word or phrase you can agree on in advance, first to force the conversation about security, scams, and deep fakes with grandma, and second to be prepared when someone calls her pretending to be you and asking to be bailed out of jail?
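The family passphrase is just a shared-secret challenge and response. As a sketch, here is one way the idea maps to code: enroll the agreed phrase as a salted hash (so the phrase itself never has to be written down anywhere) and verify it in constant time. The phrase and function names here are made up for illustration; the real “protocol” is simply grandma asking the caller for the word before she believes anything else.

```python
import hashlib
import hmac
import os


def enroll(passphrase: str) -> tuple[bytes, bytes]:
    """Store a salted PBKDF2 hash of the agreed phrase, not the phrase itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    return salt, digest


def verify(passphrase: str, salt: bytes, digest: bytes) -> bool:
    """Check a caller's claimed phrase against the enrolled hash."""
    candidate = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    # Constant-time comparison, same habit as any other credential check.
    return hmac.compare_digest(candidate, digest)
```

A voice clone can sound exactly like you (Type 3 defeated), but it fails the Type 1 check because it was never in the room when the phrase was agreed.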
To steal from a panelist I heard last week: how can you “MFA your grandma”? (Sorry, sir! I did not record your name.)