A man spent a year in jail on a murder charge that was based on disputed AI evidence. Now the case has been closed

In brief A judge has dismissed the case against a man charged with murder after prosecutors withdrew disputed evidence derived from an AI-identified gunshot sound.

Michael Williams, 65, who has denied any wrongdoing, spent 11 months in jail awaiting trial for allegedly killing Safarian Herring, 25.

One night in May last year, Williams was reportedly driving through Chicago hoping to buy cigarettes. Herring waved him down and Williams, recognizing the young man from the neighborhood, let him into his car. Shortly afterwards, another vehicle pulled up alongside, and someone in its passenger seat pulled out a gun and shot Herring in the head, Williams told police. Herring’s mother said her son, an aspiring chef, had been shot at two weeks earlier at a bus stop.

Herring, whom Williams drove to hospital, died of the gunshot wound, and Williams was ultimately charged with his murder. Central to the case was ShotSpotter, a company that operates microphones dotted across US cities, including Chicago, to immediately detect and identify gunfire using machine-learning algorithms.

Prosecutors said ShotSpotter picked up the sound of gunfire where Williams was seen in his car on surveillance camera footage, and held all of this up as evidence that Williams shot Herring right there. Police did not name a motive, had no eyewitnesses, and never found the weapon used in the attack. Williams did, however, have a criminal record, having served time for attempted murder, robbery, and discharging a gun when he was younger, though he said he has significantly turned his life around since then. He was grilled by detectives and booked.

Importantly, Williams’ attorneys, public defenders Lisa Boughton and Brendan Max, said records showed that ShotSpotter initially detected what sounded like fireworks a mile away. This was then reclassified by ShotSpotter staff – one of whom worked for the Chicago Police Department – as a gunshot at the intersection where and when Williams was seen on camera.

ShotSpotter insisted it had not improperly altered any data to favor the police’s case, and said that, regardless of the initial real-time alert, its evidence of the gunshot was the result of a follow-up forensic analysis that was submitted to the courts.

After Williams’ attorneys asked the judge handling the case to review the evidence, the prosecution last month withdrew the ShotSpotter report and moved for the case to be dismissed for insufficient evidence, a motion the judge accepted. Williams is a free man again.

“I kept trying to figure out, how can they get away with using technology like this against me?” Williams told The Associated Press, which published a full investigation into the case this week. “It’s not fair.”

The internet used our AI to create NSFW images!

Startup Kapwing, which built a web application that uses computer-vision algorithms to generate images for people, is disappointed that netizens used the code to produce NSFW material.

The software uses a combination of VQGAN and CLIP – made by researchers at Heidelberg University and OpenAI, respectively – to turn text prompts into images. This approach was popularized by artist Katherine Crowson in a Google Colab notebook; there is a Twitter account dedicated to showcasing this type of computer art.
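For those wondering how these generators work: CLIP scores how well an image matches a text prompt, VQGAN decodes a latent tensor into an image, and the trick is to nudge that latent by gradient descent until CLIP’s score improves. Below is a minimal PyTorch sketch of that loop using OpenAI’s clip package; the DummyDecoder class is a stand-in for a real taming-transformers VQGAN, and the prompt, latent shape, and hyperparameters are illustrative, not Kapwing’s actual code.

```python
# Minimal sketch of the VQGAN+CLIP loop. DummyDecoder is a stand-in for a
# real VQGAN decoder (taming-transformers): any module mapping a latent
# tensor to an RGB image in [0, 1] slots in here.
import torch
import torch.nn.functional as F
import clip  # pip install git+https://github.com/openai/CLIP.git

class DummyDecoder(torch.nn.Module):
    """Placeholder decoder: upsample the latent and squash it into [0, 1]."""
    def forward(self, z):
        return torch.sigmoid(F.interpolate(z, size=(224, 224), mode="bilinear"))

device = "cuda" if torch.cuda.is_available() else "cpu"
perceptor, _ = clip.load("ViT-B/32", device=device, jit=False)
perceptor = perceptor.float()  # avoid fp16/fp32 mismatches with our latent

# CLIP's own normalization constants for its image encoder.
mean = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

# Encode the prompt once; its embedding is the optimization target.
tokens = clip.tokenize(["gay unicorn at a funeral"]).to(device)
with torch.no_grad():
    text_features = perceptor.encode_text(tokens)

# Optimize the latent directly so the decoded image matches the prompt.
vqgan = DummyDecoder().to(device)
z = torch.randn(1, 3, 16, 16, device=device, requires_grad=True)
opt = torch.optim.Adam([z], lr=0.1)

for step in range(300):
    image = vqgan(z)                                   # decode latent to RGB
    image_features = perceptor.encode_image((image - mean) / std)
    # Maximize cosine similarity between image and text embeddings.
    loss = -F.cosine_similarity(image_features, text_features).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Real implementations, including Crowson’s notebook, typically feed CLIP several randomly augmented crops of the decoded image each step, which stabilizes the gradients and noticeably improves the results.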

Kapwing had hoped that its web implementation of VQGAN and CLIP would be used to make art from users’ requests; instead, we’re told, it was used to make filth.

“Since we work at Kapwing, an online video editor, building an AI art and video generator seemed like a perfect fit for us,” said Eric Lu, co-founder and CTO of Kapwing.

“The problem? When we allowed anyone to generate art with artificial intelligence, hardly anyone used it to create actual art. Instead, our AI model was put to work creating videos for random entries, trolling queries, and NSFW intents.”

Submitted prompts ranged from ‘naked woman’ to the bizarre ‘chocolate covered thong bikini’ or ‘gay unicorn at a funeral’. The funny thing is, the images created by the AI aren’t even that realistic or sexually explicit. Below is an example output for “naked woman”.


“Does the internet need NSFW content so badly that they type it in anywhere? Or do people just tend to try to abuse AI systems?” Lu continued. “In any case, the content produced must have [been] disappointing for these users, as most of the representations produced by our models were abstract.”

Intel terminates its RealSense business

Intel is shutting down its RealSense computer-vision wing. The business unit’s chips, cameras, LiDAR, hardware modules, and software were aimed at things like digital signage, 3D scanning, robotics, and facial-authentication systems.

Now the plug has been pulled, and RealSense boss Sagi Ben Moshe is leaving Intel after a decade at the semiconductor goliath.

“We are terminating our RealSense business and transitioning our computer-vision talent, technology, and products to focus on advancing innovative technologies that better support our core business and our IDM 2.0 strategy,” an Intel spokesperson told CRN.

All RealSense products will be phased out, although it looks like its stereo depth-sensing cameras will live on, to some extent, according to IEEE Spectrum. ®

