
Autonomy 2020 Conference

Vehicles, Manufacturing, and Platform Technologies
April 8-9, 2020

Day 1: Wednesday, April 8, 2020

8:00 - 9:00

Registration & Breakfast

9:00 - 9:15

Welcome Remarks & MIT Innovation Ecosystem

Session 1: Autonomous Vehicles

9:15 - 9:45

TBA

9:45 - 10:15

Automated Vehicles: The Current Landscape and a View Forward
The concept of automating vehicles and removing the driver from direct control of the throttle, brake, and steering wheel was first explored nearly 100 years ago. Over the decades since, automation of various features has gradually infiltrated the automobile. Today, on the heels of the DARPA Urban Challenge and Google's Self-Driving Car Project, we are closer than ever to realizing the aspirations of a century ago, but challenges remain. This talk will center on what is known about automation in the vehicle today and our evolution toward self-driving. Topics will include: observations on the use of Advanced Driver Assistance Systems (ADAS) and production-level automated driving features (Autopilot, Pilot Assist, Super Cruise, etc.); the shifting nature of what we do in modern vehicles, which raises the question of what constitutes today's distraction: secondary tasks, or driving itself; and key points to consider regarding the future of robots on our roads. How might artificial intelligence, embodied in one of the most complex activities humans perform, intersect with society's demand for economical, efficient, and safe mobility? How can human factors insight, psychological research, and policy leadership help to accelerate innovations that will someday change how we live and move? How fast might the automated, electrified future of mobility really take hold?

10:15 - 10:45

Certifiable Perception for High-Integrity Autonomous Systems
Robot perception and computer vision have witnessed unprecedented progress in the last decade. Robots and autonomous vehicles are now able to detect objects, localize them, and create large-scale maps of an unknown environment, which are crucial capabilities for navigation and manipulation. Despite these advances, both researchers and practitioners are well aware of the brittleness of current perception systems, and a large gap still separates robot and human perception. While many applications can afford occasional failures (e.g., AR/VR, domestic robotics), high-integrity autonomous systems (including self-driving vehicles) demand a new generation of algorithms. This talk discusses two efforts targeted at bridging this gap. The first focuses on robustness: I present recent advances in the design of certifiable perception algorithms that are robust to extreme amounts of outliers and afford performance guarantees. These algorithms are "hard to break" and are able to work in regimes where all related techniques fail. The second effort targets high-level understanding. While humans are able to quickly grasp both geometric and semantic aspects of a scene, high-level scene understanding remains a challenge for robotics. I present recent work on real-time metric-semantic understanding, which combines robust estimation with deep learning.

10:45 - 11:05

Networking Break

11:05 - 11:35

Challenges and Opportunities in Automated Driving
This talk will describe some of the challenges and opportunities in autonomy research today, with a focus on trends and lessons in self-driving research. We will discuss some of the major challenges and research opportunities in self-driving, including building and maintaining high-resolution maps, interacting with humans both inside and outside of vehicles, dealing with adverse weather, and achieving sufficiently high detection rates with low probabilities of false alarm in challenging settings. We will review the different approaches to automated driving, including SAE Level 2 and SAE Level 4 systems, as well as the Toyota Guardian approach, which flips the conventional mindset from having the human guard the AI (as in SAE Level 2 systems) to instead using the AI to guard the human driver. We will discuss research opportunities in mapping, localization, perception, prediction, and planning and control to realize improved safety through advanced automation in the future.

11:35 - 11:40

MIT Professional Education

11:40 - 12:35

MIT Startup Exchange Overview & Lightning Talks

MIT Startup Exchange actively promotes collaboration and partnerships between MIT-connected startups and industry. Qualified startups are those founded and/or led by MIT faculty, staff, or alumni, or are based on MIT-licensed technology. Industry participants are principally members of MIT’s Industrial Liaison Program (ILP).

MIT Startup Exchange is a community of over 1,800 MIT-connected startups with roots across MIT departments, labs and centers; it hosts a robust schedule of startup workshops and showcases, and facilitates networking and introductions between startups and corporate executives.

STEX25 is a startup accelerator within MIT Startup Exchange, featuring 25 “industry ready” startups that have proven to be exceptional with early use cases, clients, demos, or partnerships, and are poised for significant growth. STEX25 startups receive promotion, travel, and advisory support, and are prioritized for meetings with ILP’s 260 member companies.

MIT Startup Exchange and ILP are integrated programs of MIT Corporate Relations.

Lightning Talks Part I: Mobility
- Perceptive Automata: AI understanding of human intent for autonomy
- BlinkAI: Imaging AI for autonomy, robotics, sensing
- General Radar: High resolution 4D radar for autonomous machines
- NextDroid: Autonomously measure & predict self-driving capability
- Transit X: Public transit with automated pods on micro-guideways
- Cambridge Mobile Telematics: Making the world's roads and drivers safer

Lightning Talks Part II: Manufacturing & Technology
- Akasha Imaging: Vision AI for optically challenging parts
- Realtime Robotics: Motion planning for autonomous robots & vehicles
- Everactive: Self-powered wireless industrial sensors
- Top Flight Technologies: Heavy lift, long range hybrid-electric UAVs
- Lightelligence: Photonic AI accelerator chip

12:35 - 14:00

Lunch with Startup Exhibits
Additional exhibiting startups:
- SemiKing: Long-range flash LiDAR for autonomous vehicles
- NODAR: Camera-based computer vision systems for automobiles
- Southie Autonomy: Flexible robotic automation with AR & AI
- blkSAIL: Marine autonomy as a service

14:00 - 14:30

Why Fully Driverless Systems (with No Operators) Will Start in Geofenced Environments
The self-driving industry has blurred the distinction between Level 4 and Level 5 (SAE) autonomy. The reality is that L5 autonomy -- the ability to drive autonomously at every speed and level of environmental complexity -- is far from being achieved robustly, reliably, and without safety operators on board the vehicle. An alternative approach is to start in geofenced environments such as a corporate/university campus or master-planned community, where the Operational Design Domain (ODD) presents a realistic path to a fully driverless solution (Level 4). This talk discusses the challenges and opportunities in achieving this industry-wide goal.

Session 2: Autonomous Manufacturing

14:30 - 15:00

Autonomous Recursion
We will discuss research on the essential recursion that is at the heart of autonomy, from assemblers that assemble assemblers, to machines that make machines, to systems that design systems. Then we will explore applications of embodying intelligence in autonomous systems in areas including exponential manufacturing, rapid automation, physical reconfigurability, and personal fabrication.

15:00 - 15:30

TBA

15:30 - 15:50

Networking Break

15:50 - 16:20

Myths of Autonomy: A Future of Closer Collaboration Between Humans and Technology in Logistics & Manufacturing
A total transition to full autonomy in manufacturing is unlikely. While "lights-out," fully automated factories requiring no human input have long been a utopian/dystopian vision of the future, even the most automated electronics or production plants still require a large number of workers to set up, maintain, repair, and spearhead the innovation of equipment. Production systems must constantly adapt to rapidly changing conditions. With current technology, and even with developments in AI, human presence is often superior at providing that flexibility, which will likely remain the case for years to come.

In this talk, David Mindell, MIT Professor and CEO/Founder of Humatics, will discuss how automation has evolved the manufacturing industry, the critical technologies playing a role in this transformation – including the Humatics microlocation platform, which is driving productivity and safety by providing full visibility into intralogistics vehicle operations – and why full autonomy in manufacturing is a distant, unlikely future.

16:20 - 16:50

The Role of Tactile Sensing in Automated Part Picking, Handling, and Assembly
Tactile sensing plays a privileged role in the manipulation chain: it is in direct contact with the world, potentially offering direct observations of shape, motion, and force at contact. This potential, however, stands in contrast with the limited tactile reasoning of today's robots, a long-standing challenge in the robotics research community. After decades of advances in sensing instrumentation and processing power, the basic question remains: how should robots make effective use of sensed contact information? In this talk I will describe efforts in my group to develop planning and control frameworks that exploit tactile feedback, and demonstrate use cases in automated part picking, part handling, and part assembly.

16:50 - 17:00

Concluding Remarks

17:00

Networking Reception

* The schedule and speakers are subject to change without notice.