Analytics In The Hands of Children

It’s all there for the finding. All we have to do is look. It seems so obvious to so many, particularly those with an abiding faith in the power of algorithms to accomplish the task of sifting through millions of exchanges online for key words that could save lives, unearth crimes, help people. Who doesn’t want to help people?

Dozens of female high school students gathered at Maxar Technologies on Friday for an event to encourage them to further their interests and capabilities in STEM. The keynote speaker at the American Heart Association’s “Bring Stem To Life” event was 18-year-old Shreya Nallapati, a science, technology, engineering and math student who is working on a technology that scans social media profiles for potential mass shooting threats.

Getting girls interested in STEM is a goal on its own. It’s an oddity, since the majority of college students are female and can major in science or math if they so choose. And to the extent the rationalization is that the guys in STEM are unwelcoming to them (as if being made to feel special and welcome is the primary criterion for choosing what one studies for a future career), they could seize control of STEM majors by sheer numbers and leave the nasty male geeks in the dust. But anything involving women in STEM gets extra attention.

Then there’s the use of STEM for socially beneficial causes. It’s not about getting to the moon or curing cancer. It’s not about two or three lenses on the new iPhone. It’s about “potential mass shooting threats,” which means saving lives. Who doesn’t want to save lives?

Nallapati said her technology, which she dubbed “#NeverAgainTech,” will one day scan public social media pages to look for potential concerns for mass shootings. By using historical analysis of perpetrators of the past, the technology would use an algorithm to search for key words, use of firearms and other items. She said the technology would eventually rate an online user as a possible threat, or not, which could help law enforcement gain intelligence on those individuals.
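To make concrete how little there is to the approach described (scan public posts for key words, score the user, rate them a threat or not), here is a minimal sketch in Python. It is not Nallapati’s software, whose internals have not been published; every phrase, weight and threshold below is invented purely for the example.

```python
# Purely illustrative sketch of the kind of keyword scoring described above.
# This is NOT Nallapati's software, whose internals are not public; the
# phrases, weights, and threshold here are invented for the example.

FLAG_TERMS = {
    "shoot up": 5,
    "ar-15": 3,
    "manifesto": 3,
    "revenge": 2,
}
THRESHOLD = 6  # arbitrary cutoff between "possible threat" and "not"


def threat_score(post: str) -> int:
    """Sum the weights of every flagged phrase appearing in a post."""
    text = post.lower()
    return sum(weight for phrase, weight in FLAG_TERMS.items() if phrase in text)


def rate_user(posts: list[str]) -> bool:
    """Rate a user a 'possible threat' if any single post crosses the threshold."""
    return any(threat_score(p) >= THRESHOLD for p in posts)


# A gamer joking about leaderboards trips the same phrases as a genuine threat.
print(rate_user(["new high score, time to shoot up the leaderboard for revenge"]))  # True
```

A joke about a video game trips the same phrases as a genuine threat, and nothing in the scoring can tell the difference.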

What could possibly go wrong? It’s even got a cool hashtag. Technology in the hands of high school girls that would “rate online users as a possible threat” and “help law enforcement gain intelligence on those individuals” could certainly help find the next shooter. And it could find a million people who aren’t shooters, who would never be a threat, put them on a list of people rated as threats, and put their information in the hands of police, who could then scrutinize them and, should they do anything else wrong, approach them in full military regalia, weapons drawn, since they’re rated as threats.

And maybe a few will die so some high school girl, Shreya Nallapati, can feel good about her STEM interests. Like so many young people, her interest was piqued by personal experience.

Nallapati said her software could one day help save lives. She said some of her family members were in STEM Highlands Ranch when two gunmen entered the school, killing senior Kendrick Castillo and injuring others. She believed her technology could one day help flag some individuals as potential concerns based off of their social media — possibly giving law enforcement a head start on stopping a shooting.

While that doesn’t quite make her a victim, it’s close enough to protect her from criticism. After all, when someone loses a family member to tragedy, only a truly awful person would question their efforts. But she was hacked, so that makes her personally a survivor.

“It has only been four years (since I gained interest in STEM). It all started when I got in to cyber security because my laptop was hacked,” Nallapati told CBS4’s Dillon Thomas.

Despite this, even someone on the prosecution side recognized that this idea presented a significant potential for danger.

Tom Raynes, head of the Colorado District Attorney’s Council, told CBS4 the software idea was concerning, citing First Amendment freedoms for those using social media platforms. Without further knowledge of how the software worked, Raynes said it was hard to comment on whether the software would put innocent people on the radar of law enforcement.

Yet, they had this covered.

Nallapati said she was working with online privacy advocates to make sure her technology did not infringe on those the technology surveyed. Though she said the software would “profile” some users, Nallapati said she had a team of girls researching how to make sure the software did not flag individuals improperly or illegally.

Who better than a “team of girls” to decide the legality and propriety of this idea? If you can’t trust free speech in the hands of high school girls, who can you trust?

There’s nothing new about the problems with relying on algorithms. They do whatever they’re told to do, and they may well capture the handful of people at risk of being mass shooters along with millions of people who aren’t. If you’re of the view that putting millions of innocent people in the crosshairs is worth it lest a few not be found, then this won’t trouble you much.
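The arithmetic is worth spelling out. Below is a rough base-rate calculation, with every number assumed purely for illustration, showing how even an implausibly accurate screen of a very large population buries the real threats under false positives.

```python
# A rough base-rate calculation. Every number below is assumed purely for
# illustration; none comes from the article or any study.

population = 250_000_000     # assumed: social media users screened
true_threats = 50            # assumed: actual would-be shooters among them
hit_rate = 0.99              # assumed: fraction of real threats the algorithm flags
false_positive_rate = 0.01   # assumed: fraction of everyone else wrongly flagged

flagged_real = true_threats * hit_rate
flagged_innocent = (population - true_threats) * false_positive_rate

print(f"Real threats flagged:    {flagged_real:,.0f}")        # ~50
print(f"Innocent people flagged: {flagged_innocent:,.0f}")    # ~2,500,000
print(f"Chance a flagged person is a real threat: 1 in {flagged_innocent / flagged_real:,.0f}")
```

Even granting 99 percent accuracy on both ends, which no keyword scan will achieve, a flagged person would be innocent tens of thousands of times more often than not.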

But what distinguishes this effort is that it aligns with other values, such as women in STEM and children as social saviors. Consider Greta Thunberg, whose qualifications as a voice against climate change are nonexistent, yet who became so huge a focal point that there were cries of outrage over her being denied the Nobel Peace Prize.

As heartwarming as it may be to see young women interested in STEM, to have something done to protect the children from the horror of mass shootings, to use technology to solve the traumas of modern society, this combination invites praise for, and a blind eye toward, deeply troubling schemes likely to cause huge problems. It’s wonderful that 18-year-old Shreya Nallapati is engaged in this STEM project. But the project is a terrible idea, even in the hands of children.

11 thoughts on “Analytics In The Hands of Children”

      1. Onlymom

        Yep, and in the old days, you know, the early 60s, what you would see is mommy’s hand headed toward the back of the head of dear ole me!

  1. phv3773

    It’s kind of cute that Nallapatti thinks she’s out ahead of Big STEM (or Big AI) with this idea.

  2. B. McLeod

    They will need to build a lot more jails to lock up all these potentially threatening people. These accommodations should come to be called “STEM cells.”

  3. Ed

    Children and adults get caught up in the trend in TV series and movies where a profiler finds the bad guy. Fortune tellers are about as accurate.

    But, if they can build an algorithm that has a 90% success rate in picking a randomly selected winner in a horse race, then I might pay attention to children playing witch hunters.

    I respect your thoughts and I thank you for sharing them.
