
Technology

Perceptron: Risky teleoperation, Rocket League simulation, and zoologist multiplication


Research in the field of machine learning and AI, now a key technology in practically every industry and company, is far too voluminous for anyone to read it all. This column, Perceptron (previously Deep Science), aims to collect some of the most relevant recent discoveries and papers — particularly in, but not limited to, artificial intelligence — and explain why they matter.

This week in AI, researchers discovered a method that could allow adversaries to track the movements of remotely controlled robots even when the robots’ communications are encrypted end-to-end. The coauthors, who hail from the University of Strathclyde in Glasgow, said their study shows that adopting best cybersecurity practices isn’t enough to stop attacks on autonomous systems.

Remote control, or teleoperation, promises to enable operators to guide one or several robots from afar in a range of environments. Startups including Pollen Robotics, Beam, and Tortoise have demonstrated the usefulness of teleoperated robots in grocery stores, hospitals, and offices. Other companies develop remotely controlled robots for tasks like bomb disposal or surveying sites with heavy radiation.

But the new research shows that teleoperation, even when supposedly “secure,” is susceptible to surveillance. In a paper, the Strathclyde coauthors describe using a neural network to infer what operations a remotely controlled robot is carrying out. After collecting samples of TLS-protected traffic between the robot and its controller and analyzing them, they found that the neural network could identify movements about 60% of the time and also reconstruct “warehousing workflows” (e.g., picking up packages) with “high accuracy.”

Teleoperations

Image Credits: Shah et al.
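The attack surface here is classic traffic analysis: TLS hides payloads, but not packet sizes or timing. As a hedged sketch of the general idea (a toy nearest-centroid classifier over invented traffic features, not the Strathclyde team's actual neural network or data), consider:

```python
# Toy traffic-analysis classifier: guess a robot "movement" from the
# mean packet size and mean inter-arrival gap of an encrypted window.
from statistics import mean

def features(packets):
    """packets: list of (timestamp, size_bytes) for one traffic window."""
    sizes = [s for _, s in packets]
    times = [t for t, _ in packets]
    gaps = [b - a for a, b in zip(times, times[1:])] or [0.0]
    return (mean(sizes), mean(gaps))

def train(labeled_windows):
    """labeled_windows: dict mapping a movement label to packet windows."""
    return {label: tuple(mean(axis) for axis in zip(*map(features, wins)))
            for label, wins in labeled_windows.items()}

def classify(centroids, packets):
    f = features(packets)
    return min(centroids, key=lambda lab: sum(
        (a - b) ** 2 for a, b in zip(centroids[lab], f)))

# Invented example traffic: "pick" = dense bursts of large packets,
# "idle" = sparse small keepalives.
pick = [[(i * 0.01, 900 + 10 * i) for i in range(20)] for _ in range(3)]
idle = [[(i * 0.5, 100) for i in range(20)] for _ in range(3)]
model = train({"pick": pick, "idle": idle})
print(classify(model, [(i * 0.02, 850) for i in range(20)]))  # "pick"
```

A real attack would use many more features (packet directions, burst shapes) and a trained neural network, but the leak it exploits is exactly this kind of side-channel metadata.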

Alarming in a less immediate way is a new study from researchers at Google and the University of Michigan that explored people’s relationships with AI-powered systems in countries with weak legislation and “nationwide optimism” for AI. The work surveyed India-based, “financially stressed” users of instant loan platforms that target borrowers with credit determined by risk-modeling AI. According to the coauthors, the users felt indebted for the “boon” of instant loans and obligated to accept harsh terms, overshare sensitive data, and pay high fees.

The researchers argue that the findings illustrate the need for greater “algorithmic accountability,” particularly where it concerns AI in financial services. “We argue that accountability is shaped by platform-user power relations, and urge caution to policymakers in adopting a purely technical approach to fostering algorithmic accountability,” they wrote. “Instead, we call for situated interventions that enhance agency of users, enable meaningful transparency, reconfigure designer-user relations, and prompt a critical reflection in practitioners towards wider accountability.”

In less dour research, a team of scientists at TU Dortmund University, Rhine-Waal University, and LIACS Universiteit Leiden in the Netherlands developed an algorithm that they claim can “solve” the game Rocket League. Motivated to find a less computationally intensive way to create game-playing AI, the team leveraged what they call a “sim-to-sim” transfer technique, which trained the AI system to perform in-game tasks like goalkeeping and striking within a stripped-down, simplified version of Rocket League. (Rocket League basically resembles indoor soccer, except with cars instead of human players, in teams of three.)

Rocket League AI

Image Credits: Pleines et al.

It wasn’t perfect, but the researchers’ Rocket League-playing system managed to save nearly all shots fired its way when goalkeeping. On offense, it successfully scored 75% of shots — a respectable record.
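The sim-to-sim notion, roughly "train where computation is cheap, then evaluate somewhere richer," can be caricatured in a few lines. This is a hedged toy (the environments, noise model, and policy are all invented for illustration, not the authors' setup):

```python
# Toy sim-to-sim transfer: a goalkeeping policy learned in a stripped-down
# simulator is evaluated in a slightly richer simulator with sensor noise.
import random

def simple_sim(policy, episodes=200):
    """Minimal sim: the ball arrives at a lane in [0, 4]; save = match it."""
    saves = 0
    for _ in range(episodes):
        ball = random.randint(0, 4)
        saves += policy(ball) == ball
    return saves / episodes

def rich_sim(policy, episodes=200, noise=0.3):
    """'Full' sim: the keeper observes the lane with Gaussian noise."""
    saves = 0
    for _ in range(episodes):
        ball = random.randint(0, 4)
        observed = ball + random.gauss(0, noise)
        saves += policy(observed) == ball
    return saves / episodes

def policy(obs):
    # "Trained" behavior from the simple sim: snap the observation to a lane.
    return max(0, min(4, round(obs)))

random.seed(0)
print(simple_sim(policy))  # perfect (1.0) in the noise-free simplified sim
print(rich_sim(policy))    # high, but below 1.0 once transferred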

Simulators for human movements are also advancing at pace. Meta’s work on tracking and simulating human limbs has obvious applications in its AR and VR products, but it could also be used more broadly in robotics and embodied AI. Research that came out this week got a tip of the cap from none other than Mark Zuckerberg.

Simulated skeleton and muscle groups in MyoSuite.


MyoSuite simulates muscles and skeletons in 3D as they interact with objects and with themselves. That matters for agents learning to hold and manipulate things without crushing or dropping them, and it also provides realistic grips and interactions in virtual worlds. It supposedly runs thousands of times faster on certain tasks, which lets simulated learning processes happen much more quickly. “We’re going to open source these models so researchers can use them to advance the field further,” Zuck says. And they did!

Lots of these simulations are agent- or object-based, but this project from MIT looks at simulating an overall system of independent agents: self-driving cars. The idea is that if you have a good amount of cars on the road, you can have them work together not just to avoid collisions, but to prevent idling and unnecessary stops at lights.

Animation of cars slowing down at a 4-way intersection with a stoplight.

If you look closely, only the front cars ever really stop.

As you can see in the animation above, a set of autonomous vehicles communicating via V2V protocols can keep all but the very front cars from stopping at all, by progressively slowing down behind one another, though never so much that they actually come to a halt. This sort of hypermiling behavior may not seem to save much gas or battery, but when you scale it up to thousands or millions of cars it makes a difference, and it might make for a more comfortable ride, too. Good luck getting everyone to approach the intersection perfectly spaced like that, though.
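The core behavior is simple to sketch: the lead car brakes for the light, while each follower, told the speed of the car ahead over V2V, eases toward a slightly slower speed but never below a crawl. This toy simulation (all parameters invented, not MIT's model) shows only the lead car's speed ever reaching zero:

```python
def simulate(n_cars=6, steps=40):
    """Lead car stops for a red light (green after step 20); each follower
    targets 90% of the car ahead's previous speed, floored at a crawl."""
    speeds = [10.0] * n_cars      # m/s, all cars cruising initially
    min_speed = list(speeds)      # slowest speed each car ever reaches
    for t in range(steps):
        lead_target = 0.0 if t < 20 else 10.0
        targets = [lead_target] + [max(speeds[i - 1] * 0.9, 1.0)
                                   for i in range(1, n_cars)]
        # limited acceleration/braking of 1.5 m/s per step
        speeds = [v + max(min(tgt - v, 1.5), -1.5)
                  for v, tgt in zip(speeds, targets)]
        min_speed = [min(m, v) for m, v in zip(min_speed, speeds)]
    return min_speed

mins = simulate()
print(mins)  # only the lead car's minimum speed is 0.0
```

Because every follower's target is floored above zero and deceleration is rate-limited, the queue compresses smoothly instead of stop-and-go rippling backward, which is the behavior the animation illustrates.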

Switzerland is taking a good, long look at itself — using 3D scanning tech. The country is making a huge map using UAVs equipped with lidar and other tools, but there’s a catch: the movement of the drone (deliberate and accidental) introduces error into the point map that needs to be manually corrected. Not a problem if you’re just scanning a single building, but an entire country?

[embedded content]

Fortunately, a team out of EPFL is integrating an ML model directly into the lidar capture stack that can determine when an object has been scanned multiple times from different angles and use that info to line up the point map into a single cohesive mesh. This news article isn’t particularly illuminating, but the paper accompanying it goes into more detail. An example of the resulting map is visible in the video above.

Lastly, in unexpected but highly pleasant AI news, a team from the University of Zurich has designed an algorithm for tracking animal behavior so zoologists don’t have to scrub through weeks of footage to find the two examples of courting dances. It’s a collaboration with the Zurich Zoo, which makes sense when you consider the following: “Our method can recognize even subtle or rare behavioral changes in research animals, such as signs of stress, anxiety or discomfort,” said lab head Mehmet Fatih Yanik.

So the tool could be used for learning and tracking behaviors in captivity, for the well-being of captive animals in zoos, and for other forms of animal studies as well. Researchers could use fewer subject animals and get more information in a shorter time, with less work by grad students poring over video files late into the night. Sounds like a win-win-win-win situation to me.

Illustration of monkeys in a tree being analyzed by an AI.

Image Credits: Ella Marushenko / ETH Zurich

Also, love the illustration.


Tesla more than tripled its Austin gigafactory workforce in 2022


Tesla’s 2,500-acre manufacturing hub in Austin, Texas more than tripled its workforce last year, according to the company’s annual compliance report filed with county officials. Bloomberg first reported the news.

The report filed with Travis County’s Economic Development Program shows that Tesla increased its Austin workforce from just 3,523 contingent and permanent employees in 2021 to 12,277 by the end of 2022. Bloomberg reports that just over half of Tesla’s workers reside in the county, with the average full-time employee earning a salary of at least $47,147. Outside of Tesla’s factory, the average salary of an Austin worker is $68,060, according to data from ZipRecruiter.

TechCrunch was unable to acquire a copy of the report, so it’s not clear whether those workers are all full-time. If they are, Tesla has hired far more full-time employees than its agreement requires: according to the agreement between Tesla and Travis County, the company is obligated to create 5,001 new full-time jobs over the next four years.

The contract also states that Tesla must invest about $1.1 billion in the county over the next five years. Tesla’s compliance report shows that the automaker last year invested $5.81 billion in Gigafactory Texas, which officially launched a year ago at a “Cyber Rodeo” event. In January, Tesla notified regulators that it plans to invest another $770 million into an expansion of the factory to include a battery cell testing site and cathode and drive unit manufacturing site. With that investment will come more jobs.

Tesla’s choice to move its headquarters to Texas and build a gigafactory there has helped the state lead the nation in job growth. The automaker builds its Model Y crossover there and plans to build the Cybertruck in Texas as well. Giga Texas will also be a model for sustainable manufacturing, CEO Elon Musk has said. Last year, Tesla completed the first phase of what will become “the largest rooftop solar installation in the world,” according to the report, per Bloomberg. Tesla has begun the second phase of installation, and there are already reports that the rooftop is visible from space. The goal is to generate 27 megawatts of power.

Musk has also promised to turn the site into an “ecological paradise,” complete with a boardwalk and a hiking/biking trail that will open to the public. There haven’t been many updates on that front, and locals have been concerned that the site is actually more of an environmental nightmare that has led to noise and water pollution. The site, located at the intersection of State Highway 130 and Harold Green Road, east of Austin, sits along the Colorado River and could face catastrophic flooding if the river overflows.

The site of Tesla’s gigafactory has also historically been the home of low-income households and has a large population of Spanish-speaking residents. It’s not clear if the jobs at the factory reflect the demographic population of the community in which it resides.



Launch startup Stoke Space rolls out software tool for complex hardware development


Stoke Space, a company that’s developing a fully reusable rocket, has unveiled a new tool to let hardware companies track the design, testing and integration of parts. The new tool, Fusion, is targeting an unsexy but essential aspect of the hardware workflow.

It’s a solution born out of “ubiquitous pain in the industry,” Stoke CEO Andy Lapsa said in a recent interview. The current parts tracking status quo is marked by cumbersome, balkanized solutions built on piles of paperwork and spreadsheets. Many of the existing tools are not optimized “for boots on the ground,” but for finance or procurement teams, or even the C-suite, Lapsa explained.

In contrast, Fusion is designed to optimize simple inventory transactions and parts organization, and it will continue to track parts through their lifespan: as they are built into larger assemblies and go through testing. In an extreme example, such as hardware failures, Fusion will help teams connect anomalous data to the exact serial numbers of the parts involved.

Image credit: Stoke Space

“If you think about aerospace in general, there’s a need and a desire to be able to understand the part pedigree of every single part number and serial number that’s in an assembly,” Lapsa said. “So not only do you understand the configuration, you understand the history of all of those parts dating back to forever.”
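The "part pedigree" concept reduces to a data model in which every serialized part carries an event log and assemblies reference child serials. The sketch below is a hypothetical illustration of that idea (the names, classes, and serial formats are invented; this is not Fusion's actual API):

```python
# Hypothetical part-pedigree ledger: each serialized part has a history of
# events, assemblies reference child serials, and an anomaly can be traced
# to every serial number inside an assembly.
from dataclasses import dataclass, field

@dataclass
class Part:
    serial: str
    history: list = field(default_factory=list)   # (event, detail) tuples
    children: list = field(default_factory=list)  # serials of sub-parts

class PartLedger:
    def __init__(self):
        self.parts = {}

    def add(self, serial):
        self.parts[serial] = Part(serial)

    def log(self, serial, event, detail=""):
        self.parts[serial].history.append((event, detail))

    def assemble(self, parent, child):
        self.parts[parent].children.append(child)
        self.log(child, "installed-in", parent)

    def pedigree(self, serial):
        """All serials inside an assembly, including the assembly itself."""
        out = [serial]
        for child in self.parts[serial].children:
            out.extend(self.pedigree(child))
        return out

ledger = PartLedger()
for s in ("ENG-001", "PUMP-007", "VALVE-042"):
    ledger.add(s)
ledger.log("PUMP-007", "acceptance-test", "passed")
ledger.assemble("ENG-001", "PUMP-007")
ledger.assemble("PUMP-007", "VALVE-042")
print(ledger.pedigree("ENG-001"))  # ['ENG-001', 'PUMP-007', 'VALVE-042']
```

The recursive `pedigree` walk is what makes the failure-investigation case work: given one anomalous assembly, you recover the full set of serials, and each serial's history, "dating back to forever."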

While Lapsa clarified that Fusion is the result of an organic in-house need for better parts management – designing a fully reusable rocket is complicated, after all – turning it into a sellable product was a decision the Stoke team made early on. It’s a notable example of a rocket startup generating revenue streams while its vehicle is still under development.

Fusion is particularly relevant to startups. Many existing tools are designed for production runs – not the fast-moving research and development environment that many hardware startups find themselves in, Lapsa added. In these environments, speed and accuracy are paramount.

Brent Bradbury, Stoke’s head of software, echoed these comments.

“The parts are changing, the people are changing, the processes are changing,” he said. “This lets us capture all that as it happens without a whole lot of extra work.”



Amid a boom in AI accelerators, a UC Berkeley-focused outfit, House Fund, swings open its doors


Companies at the forefront of AI would naturally like to stay at the forefront, so it’s no surprise they want to stay close to smaller startups that are putting some of their newest advancements to work.

Last month, for example, Neo, a startup accelerator founded by Silicon Valley investor Ali Partovi, announced that OpenAI and Microsoft have offered to provide free software and advice to companies in a new track focused on artificial intelligence.

Now, another Bay Area outfit — House Fund, which invests in startups with ties to UC Berkeley — says it is launching an AI accelerator and that, similarly, OpenAI, Microsoft, Databricks, and Google’s Gradient Ventures are offering participating startups free and early access to tech from their companies, along with mentorship from top AI founders and executives at these companies.

We talked with House Fund founder Jeremy Fiance over the weekend to get a bit more color about the program, which will replace a broader-based accelerator program House Fund has run and whose alums include an additive manufacturing software company, Dyndrite, and the managed app development platform Chowbotics, whose most recent round in January brought the company’s total funding to more than $60 million.

For founders interested in learning more, the new AI accelerator program runs for two months, kicking off in early July and ending in early September. Six or so companies will be accepted, with the early application deadline coming up next week on April 13th. (The final application deadline is on June 1.) As for the time commitment involved across those two months, every startup could have a different experience, says Fiance. “We’re there when you need us, and we’re good at staying out of the way.”

There will be the requisite kickoff retreat to launch the program and let founders get to know one another. Accepted candidates will also have access to some of UC Berkeley’s renowned AI professors, including Michael Jordan, Ion Stoica, and Trevor Darrell. And they can opt into dinners and events in collaboration with these various constituents.

As for some of the financial dynamics, every startup that goes through the program will receive a $1 million investment on a $10 million post-money SAFE note. Importantly, too, as with the House Fund’s venture dollars, its AI accelerator is seeking startups that have at least one Berkeley-affiliated founder on the co-founding team. That includes alumni, faculty, PhDs, postdocs, staff, students, dropouts, and other affiliates.
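Those terms imply straightforward ownership arithmetic: under a standard post-money SAFE, the investor's stake at conversion is the investment divided by the post-money valuation cap (this is generic SAFE math under a no-other-dilution assumption, not a figure from House Fund's documents):

```python
# Ownership implied by a $1M investment on a $10M post-money SAFE.
investment = 1_000_000
post_money_cap = 10_000_000
ownership = investment / post_money_cap
print(f"{ownership:.0%}")  # 10%
```

In other words, the accelerator's standard check buys roughly a tenth of each participating company before any later rounds.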

There is no demo day. Instead, says Fiance, founders will receive “directed, personal introductions” to the VCs who best fit with their startups.

Given the buzz over AI, the new program could supercharge House Fund, the venture organization, which is already growing fast. Fiance launched it in 2016 with just $6 million and it now manages $300 million in assets, including on behalf of Berkeley Endowment Management Company and the University of California.

At the same time, the competition out there is fierce and growing more so by the day.

Though OpenAI has offered to partner with House Fund, for example, the San Francisco-based company announced its own accelerator, Converge, back in November. That cohort was to be made up of 10 or so founders who each received $1 million and admission to five weeks of office hours, workshops, and other events, with the funding coming from the OpenAI Startup Fund.

Y Combinator, the biggest accelerator in the world, is also oozing with AI startups right now, all of them part of a winter class that will be talking directly with investors this week via demo days that are taking place tomorrow, April 5th, and on Thursday.

