STORES HAVE BEEN closing in Oakland and San Francisco because of crime. Broad-daylight robberies continue unabated in Fremont. Car thefts and robberies in San Francisco went up in 2023. Other Bay Area cities have grievous crime stories to tell too. People are frustrated. Artificial intelligence (AI) can help here, and the cities should leverage it in a big way.
Integrating generative AI with surveillance camera systems can make real-time crime detection and prevention possible. Some of the technology is already in use, for different purposes.
It is true that privacy concerns remain, as do accuracy issues such as false positives and false negatives, but they do not outweigh the benefits.
GPT-4 from OpenAI, of ChatGPT fame, powers the “Be My Eyes Virtual Volunteer” app for people who are visually impaired, answering questions about images uploaded through the app.
Research shows that GPT-4’s visual understanding is good enough for general-purpose emotion recognition, and the vision capabilities of the more recently released GPT-4o have only improved on it.
Incorporating similar functionality into cameras makes them multimodal: they can interpret what is happening on the streets and raise an audio alarm, send a text message to law enforcement, or email a description to emergency dispatch. The possibilities multiply when such cameras are integrated with the smart city infrastructure now on the rise.
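To make the idea concrete, here is a minimal sketch of how such an alert pipeline might be wired up. The model call is stubbed out, and all names, thresholds, and channels are illustrative assumptions, not a description of any deployed system; in practice the stub would be replaced by a call to a vision-language model.

```python
# A hypothetical sketch of a camera-to-alert pipeline. The model call is a
# stub; thresholds and channel names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SceneAssessment:
    description: str      # model's text summary of the camera frame
    threat_score: float   # 0.0 (benign) to 1.0 (urgent)

def assess_frame(frame_bytes: bytes) -> SceneAssessment:
    """Stub for a vision-language model call; a real system would send the
    frame to a multimodal model and parse its structured reply."""
    return SceneAssessment(
        description="two people loitering near a storefront",
        threat_score=0.2,
    )

def dispatch(assessment: SceneAssessment) -> list[str]:
    """Route one assessment to zero or more alert channels."""
    actions = []
    if assessment.threat_score >= 0.9:
        actions.append("audio_alarm")          # on-site siren
    if assessment.threat_score >= 0.7:
        actions.append("sms_law_enforcement")  # text to law enforcement
    if assessment.threat_score >= 0.5:
        actions.append("email_dispatch")       # description to dispatch
    return actions

# A high-score assessment triggers all three channels; a benign one, none.
urgent = SceneAssessment("person brandishing a weapon", 0.95)
print(dispatch(urgent))  # ['audio_alarm', 'sms_law_enforcement', 'email_dispatch']
```

The tiered thresholds reflect the point above: the same multimodal interpretation can drive different responses, from a local alarm to a message to dispatch, depending on urgency.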
Identifying threats, alerting law enforcement
AI-based object recognition in cameras can identify criminals, weapons, unattended items, and other anomalies in a crowd almost instantaneously and send alerts to law enforcement. Legislation in Kansas, if the governor signs it, would provide grants for schools to use AI-powered cameras for gun detection.
An even better approach is to use drones with similar vision capabilities. Drones are being used to detect explosives in Ukraine and in other war zones. Drones equipped with the enhanced vision capabilities demonstrated by Project Astra at the recent Google I/O will be able to understand many intricate situations that may require intervention to keep the cities safe.
Google I/O also showcased enhanced audio understanding using AI. Incorporating these abilities into the drones can drastically improve their situational awareness and can help law enforcement substantially.
The cities should seriously consider working with tech giants and universities to make all this happen. Companies should prioritize such partnerships as part of their corporate social responsibility efforts, and universities should do so to enhance their research profiles.
When countries are deploying spy satellites with telescopes that can read car license plates in real time from hundreds of miles away, it makes all the more sense to deploy drones for the safety of citizens.
Drones are used extensively in ISR (intelligence, surveillance, and reconnaissance) and ISTAR (intelligence, surveillance, target acquisition, and reconnaissance) missions to help forces in war zones. It is high time they were used to help law enforcement as well.
Drones can be controlled from ground stations hundreds of miles away, and the vision understanding capabilities of AI can further automate that control, reducing the workload on law enforcement.
More than the human eye can see
LiDAR (Light Detection and Ranging) is widely used in self-driving cars. Drones equipped with LiDAR can scan large areas with far more precision and capture finer details for analysis. Such drones are immensely helpful in covering dangerous environments, such as dilapidated buildings and poor-visibility areas, that pose fatal risks to human responders and are often the sites of law-and-order issues.
Some drones come with Wide Area Motion Imagery (WAMI) systems that provide 360-degree panoramic video of a scene, useful for detailed crime analysis and prevention. Others are equipped with infrared cameras or thermal imaging that can search for criminals hiding in the dark.
Concerns about civil liberties and privacy can be addressed through human oversight, stringent penalties for misusing privacy provisions, and keeping humans in the loop to handle borderline cases and avoid misclassifications.
Last November, I gave a talk to a University of Bolton audience on social sustainability, focusing specifically on privacy and mechanisms such as the differential privacy used by the U.S. Census Bureau to protect it. There is certainly a need for more research in this space, but we do not have to wait.
There is a huge, untapped potential for using technology to keep our cities safer. The government should tap into it.
About the author
Vishnu S. Pendyala holds a Ph.D. and MBA in Finance and teaches machine learning and other data science courses at San Jose State University. He is a Public Voices Fellow with The OpEd Project. Opinions expressed are his own and not those of his employer or any other entity that he is affiliated with.
The post How Bay Area cities can leverage the power of AI to build safer, more secure communities appeared first on Local News Matters.