Showing posts with label Autonomous cars. Show all posts

Keeping it fresh for 35 years

By Megan Alink, Director of Marketing Communications for Automotive

Recently, my colleagues Paul Leroux and Matt Young showed off a shiny new infographic that enlightens readers to the many ways they encounter QNX-based systems in daily life (here and here). After three-and-a-half decades in business, we’ve certainly been around the block a time or two, and you might think things are getting a bit stale. As the infographic shows, that couldn’t be further from the truth here at QNX. From up in the stars to down on the roads; in planes, trains, and automobiles (and boats too); whether you’re mailing a letter or crafting a BBM on your BlackBerry smartphone, the number and breadth of applications in which our customers deploy QNX technology are simply astounding.

For those who like some sound with their pictures, we also made a video to drive home the point that, wherever you are and whatever you do, chances are you’ll encounter a little QNX. Check it out:


“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct, not a replacement, for eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

New to 26262? Have I got a primer for you

Driver error is the #1 problem on our roads — and has been since 1869. In August of that year, a scientist named Mary Ward became the first person to die in an automobile accident, after being thrown from a steam-powered car. Driver error was a factor in Mary’s death and, 145 years later, it remains a problem, contributing to roughly 90% of motor vehicle crashes.

Can ADAS systems mitigate driver error and reduce traffic deaths? The evidence suggests that, yes, they help prevent accidents. That said, ADAS systems can themselves cause harm, if they malfunction. Imagine, for example, an adaptive cruise control system that underestimates the distance of a car up ahead. Which raises the question: how can you trust the safety claims for an ADAS system? And how do you establish that the evidence for those claims is sufficient?
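To make that failure mode concrete, here is a toy sketch of the core computation inside a constant-time-gap adaptive cruise controller. Everything below, including the names, gains, and thresholds, is illustrative rather than drawn from any real product:

```python
# Toy sketch of a constant-time-gap adaptive cruise controller. All
# gains, names, and numbers are illustrative; a production ACC is far
# more involved (sensor fusion, validation to ISO 26262, and so on).

TIME_GAP_S = 1.8            # desired headway to the lead car, in seconds
STANDSTILL_GAP_M = 4.0      # minimum gap when stopped
K_GAP, K_SPEED = 0.25, 0.6  # controller gains

def acc_accel(own_speed_mps: float, lead_speed_mps: float,
              measured_gap_m: float) -> float:
    """Return a (clamped) acceleration command in m/s^2."""
    desired_gap = STANDSTILL_GAP_M + TIME_GAP_S * own_speed_mps
    accel = (K_GAP * (measured_gap_m - desired_gap)
             + K_SPEED * (lead_speed_mps - own_speed_mps))
    return max(-3.5, min(accel, 2.0))  # comfort/actuator limits

# At 30 m/s with a 58 m gap to a matched-speed lead car, the desired
# gap is exactly 58 m, so the command is ~0: hold speed.
print(round(acc_accel(30.0, 30.0, 58.0), 2))  # -> 0.0
# If the sensor UNDERESTIMATES the gap (reporting 40 m when the true
# gap is 58 m), the car brakes hard for no reason.
print(round(acc_accel(30.0, 30.0, 40.0), 2))  # -> -3.5
```

A ranging error of a few metres is all it takes to turn a comfort feature into a hazard, which is exactly the kind of risk the standard exists to manage.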

Enter ISO 26262. This standard, introduced in 2011, provides a comprehensive framework for validating the functional safety claims of ADAS systems, digital instrument clusters, and other electrical or electronic systems in production passenger vehicles.

ISO 26262 isn’t for the faint of heart. It’s a rigorous, 10-part standard that recommends tools, techniques, and methodologies for the entire development cycle, from specification to decommissioning. In fact, to develop a deep understanding of 26262 you must first become versed in another standard, IEC 61508, which forms the basis of 26262.

ISO 26262 starts from the premise that no system is 100% safe. Consequently, the system designer must perform a hazard and risk analysis to identify the safety requirements and residual risks of the system being developed. The outcome of that analysis determines the Automotive Safety Integrity Level (ASIL) of the system, as defined by 26262. ASILs range from A to D, where A represents the lowest degree of hazard and D, the highest. The higher the ASIL, the greater the degree of rigor that must be applied to assure the system avoids residual risk.
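For the curious, the classification behind the ASIL works roughly like this: each hazard is rated for severity (S1–S3), probability of exposure (E1–E4), and controllability (C1–C3), and the combination maps to an ASIL via a lookup table in Part 3 of the standard. The sketch below reproduces that table with a well-known arithmetic shortcut; treat it as an illustration, not a substitute for the standard:

```python
# Sketch of ISO 26262 ASIL determination (Part 3 hazard classification).
# Severity (S1-S3), exposure (E1-E4), and controllability (C1-C3)
# combine to an ASIL. Summing the three indices reproduces the Part 3
# lookup table: sum <= 6 -> QM (quality management only, no ASIL),
# 7 -> A, 8 -> B, 9 -> C, 10 -> D.

def asil(severity: int, exposure: int, controllability: int) -> str:
    """Return the ASIL ('QM' or 'A'..'D') for an S/E/C classification."""
    if not (1 <= severity <= 3 and 1 <= exposure <= 4
            and 1 <= controllability <= 3):
        raise ValueError("expected S1-S3, E1-E4, C1-C3")
    total = severity + exposure + controllability
    return "QM" if total <= 6 else "ABCD"[total - 7]

# Example: a hazard that is life-threatening (S3), arises in everyday
# driving (E4), and is hard for the driver to control (C3) -> ASIL D.
print(asil(3, 4, 3))  # -> D
print(asil(1, 2, 1))  # -> QM
```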

Having determined the risks (and the ASIL), the system designer selects an appropriate architecture. The designer must also validate that architecture, using tools and techniques that 26262 either recommends or highly recommends. If the designer believes that a recommended tool or technique isn’t appropriate to the project, he or she must provide a solid rationale for the decision and justify why the technique actually used is as good as or better than the one recommended by 26262.

The designer must also prepare a safety case. True to its name, this document presents the case that the system is sufficiently safe for its intended application and environment. It comprises three main components: 1) a clear statement of what is claimed about the system, 2) the argument that the claim has been met, and 3) the evidence that supports the argument. The safety case should convince not only the 26262 auditor, but also the entire development team, the company’s executives, and, of course, the customer. Of course, no system is safe unless it is deployed and used correctly, so the system designer must also produce a safety manual that sets the constraints within which the product must be deployed.
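The three components can be pictured as a simple record. The example below is purely illustrative; a real safety case consists of prose documents and evidence artifacts, not code, and every string here is invented:

```python
# Illustrative sketch of the claim/argument/evidence structure of a
# safety case. All contents are hypothetical examples, not artifacts
# from any actual 26262 project.
from dataclasses import dataclass, field

@dataclass
class SafetyCase:
    claim: str                  # what is claimed about the system
    argument: str               # why the claim is believed to hold
    evidence: list = field(default_factory=list)  # supporting artifacts

case = SafetyCase(
    claim="The cluster displays a valid vehicle speed or a telltale "
          "within 200 ms of receiving CAN input",
    argument="A watchdog-supervised render loop detects stalls and "
             "falls back to the telltale; failure modes are argued "
             "over the fault tree",
    evidence=["fault-tree analysis", "HIL test report", "code review log"],
)
print(len(case.evidence))  # -> 3
```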

Achieving 26262 compliance is a major undertaking. That said, any conscientious team working on a safety-critical project would probably apply most of the recommended techniques. The standard was created to ensure that safety isn’t treated as an afterthought during final testing, but as a matter of due diligence in every stage of development.

If you’re a system designer or implementer, where do you start? I would suggest “A Developer’s View of ISO 26262”, an article recently authored by my colleague Chris Hobbs and published in EE Times Automotive Europe. The article provides an introduction to the standard, based on experience of certifying software to ISO 26262, and covers key topics such as ASILs, recommended verification tools and techniques, the safety case, and confidence from use.

I also have two whitepapers that may prove useful: Architectures for ISO 26262 systems with multiple ASIL requirements, written by my colleague Yi Zheng, and Protecting software components from interference in an ISO 26262 system, written by Chris Hobbs and Yi Zheng.

Driving simulators at CES

CES was just 15 minutes from closing when I managed to slip away from the very busy QNX booth to try out an F1 simulator. Three screens, 6 degrees of freedom, and surround sound came together for the most exciting simulated driving experience I have ever had. I was literally shaking when they dragged me out of the driver’s seat (I didn’t want to stop :-). Mind you, at around $80K for the system, it seems unlikely I will ever own one.

The experience got me thinking about the types of vehicles currently in simulation or in the lab that I fully expect to drive in my lifetime: cars that are virtually impossible to crash, cars that make it painless to travel long distances, and, ultimately, cars that worry about traffic jams so I can read a book.

Re-incarnated: the QNX reference vehicle.
QNX Software Systems had a very popular simulator of its own at CES this year. You may have seen some details on it already, but to recap: it is a new incarnation of our trusty QNX reference vehicle, extended to demonstrate ADAS capabilities. We parked it in front of a 12-foot display and used video footage captured on California’s fabled Highway 1 to provide the closest thing to real-world driving we could create.

The resulting virtual drive showcased the capabilities not only of QNX technology, but of our ecosystem as well. Using the video footage, we provided camera inputs to Itseez’ computer vision algorithms to demonstrate a working example of lane departure warning and traffic sign recognition. By capturing GPS data synchronized with the video footage, and feeding the result through Elektrobit’s Electronic Horizon Solution, we were able to generate curve speed warnings. All this was running on automotive-grade Jacinto 6 silicon from Texas Instruments. LiDAR technology from Phantom Intelligence rounded out the offering by providing collision feedback to the driver.
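To give a flavor of what a lane departure warning decides once the vision algorithms have located the lane lines, here is a deliberately simplified sketch. The function names, thresholds, and geometry are hypothetical, not taken from the Itseez pipeline used in the demo:

```python
# Simplified sketch of lane-departure-warning decision logic. A real
# system extracts lane-line positions from camera frames; here they
# are given directly. All names and thresholds are illustrative.

WARN_MARGIN_M = 0.30  # warn when a lane line is within 30 cm of the wheel

def lane_departure_warning(left_line_m: float, right_line_m: float,
                           vehicle_half_width_m: float = 0.9,
                           turn_signal_on: bool = False) -> bool:
    """left_line_m / right_line_m: distance from the vehicle centreline
    to each detected lane line (positive values). Returns True when a
    warning should be raised."""
    if turn_signal_on:  # an intentional lane change: stay quiet
        return False
    left_gap = left_line_m - vehicle_half_width_m
    right_gap = right_line_m - vehicle_half_width_m
    return min(left_gap, right_gap) < WARN_MARGIN_M

# Centred in a ~3.6 m lane: both lines ~1.8 m away -> no warning.
print(lane_departure_warning(1.8, 1.8))  # -> False
# Drifting left: left line only 1.0 m from centre -> warning.
print(lane_departure_warning(1.0, 2.6))  # -> True
# Same drift with the turn signal on -> suppressed.
print(lane_departure_warning(1.0, 2.6, turn_signal_on=True))  # -> False
```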

The lane departure and curve speed warnings in action. Screen-grab from video by Embedded Computing Design.

Meeting the challenge
While at CES, I also had the opportunity to meet with companies that are working to make advanced ADAS systems commercially viable. Phantom Intelligence is one example but I was also introduced to companies that can provide thermal imaging systems and near-infrared cameras at a fraction of what these technologies cost today.

These are all examples of how the industry is rising up to meet the challenge of safer, more autonomous vehicles at a price point that allows for widespread adoption in the foreseeable future. Amazing stuff, really — we are finally entering the era of the Jetsons.

By the way, I can’t remember what booth I was in when I drove the simulator. But I’m willing to bet that the people who experienced the Jeep at CES will remember they were in the QNX booth, seeing technology from QNX and its key partners in this exciting new world.

One day I’ll be Luke Skywalker

Cyril Clocher
What happens when you blend ADAS with infotainment? Guest post by Cyril Clocher, business manager for automotive processors at Texas Instruments

As we all begin preparing for our trek to Vegas for CES 2015, I would like my young friends (born in the 70s, of course) to reflect on their impressions of the first installment of Lucas’s trilogy back in 1977. For my part, I perfectly remember thinking that one day I would be Luke Skywalker.

Young boys and girls were amazed by this epic space opera, and particularly by the technologies our heroes used to fight the Galactic Empire. You have to remember, it was an era when we still used rotary phones and GPS was in its infancy. So you can imagine how impactful it was for us to see our favorite characters using wireless electronic gadgets with revolutionary HMIs such as natural voice recognition, gesture control, and touch screens; droids speaking and enhancing human intelligence; and autonomous vehicles traveling the galaxy safely while their passengers played chess with a Wookiee. Now you’re with me…

But instead of becoming Luke Skywalker a lot of us realized that we would have a bigger impact by inventing or engineering these technologies and by transforming early concepts into real products we all use today. As a result, smartphones and wireless connectivity are now in our everyday lives; the Internet of Things (IoT) is getting more popular in applications such as activity trackers that monitor personal metrics; and our kids are more used to touch screens than mice or keyboards, and cannot think of on-line gaming without gesture control. In fact, I just used voice recognition to upgrade the Wi-Fi plan from my Telco provider.

But the journey is not over yet. Our generation has yet to deliver an autonomous vehicle that is green, safe, and fun to control – I think the word “drive” will be obsolete for such a vehicle.

The automotive industry has taken several steps toward this exciting goal, including the integration of advanced, connected in-car infotainment systems in more models, as well as a number of technologies categorized under Advanced Driver Assistance Systems (ADAS) that can create a safer and unique driving experience. For more than a decade, Texas Instruments has invested in infotainment and ADAS: “Jacinto” and TDAx automotive processors, as well as the many analog companion chips supporting these trends.

"Jacinto 6 EP" and "Jacinto 6 Ex" infotainment processors
What makes TI’s approach unique is our ability to leverage the best of both worlds for non-safety-critical features, and to provide seamless integration of informational ADAS functions into existing infotainment systems, so the vehicle can better inform and warn the driver. We announced that capability at SAE Convergence in Detroit in October 2014 with the “Jacinto 6 Ex” processor (DRA756), which contains powerful CPU, graphics, multimedia, and radio cores, along with differentiated vision co-processors, called embedded vision engines (EVE), and additional DSPs that perform the complex ADAS processing.

For TI’s automotive team, the CES 2015 show is even more exciting than in previous years, as we’ve taken our concept of informational ADAS to the next step. Through the joint efforts and hard work of both the TI and QNX teams, we have implemented a real informational ADAS system running the QNX CAR™ Platform for Infotainment on a “Jacinto 6 Ex” processor.

I could try describing this system in detail, but just like the Star Wars movies, it’s best to experience our “Jacinto 6 Ex” and QNX CAR Platform-based system in person. Contact your TI or QNX representative today to schedule a meeting in our private suite at the TI Village (N115-N119) at CES, where you can immerse yourself in a combined IVI, cluster, megapixel surround view, and DLP®-based HUD with augmented reality, all running on a single “Jacinto 6 Ex” SoC. And don't forget to visit the QNX booth (2231), where you can see the QNX reference vehicle running a variety of ADAS and infotainment applications on “Jacinto 6” processors.

Integrated cockpit featuring DLP powered HUD and QNX CAR Platform running on a single “Jacinto 6 Ex” SoC.
One day I’ll experience Skywalker’s life: I will no doubt have the opportunity to control an intelligent, autonomous vehicle with my biometrics, voice, and gestures while riding with my family to the movie theater, playing chess with my grandkids (not yet a Wookiee).

A need for speed... and safety

Matt Shumsky
For me, cars and safety go hand in hand. Don’t get me wrong, I have a need for speed. I do, after all, drive a 2006 compact with 140 HP (pause for laughter). But no one, and I mean no one, wants to be barreling down a highway in icy conditions at 120 km/hr without working brakes, am I right?

So this raises the question: what’s the best way to design a software system that ensures the adaptive cruise control keeps a safe distance from the car ahead? Or that tells the digital instrument cluster the correct information to display? And how can you make sure the display information isn’t corrupted?

Enter QNX and the ISO 26262 functional safety standard.

QNX Software Systems is partnering with LDRA to present a webinar on “Ensuring Automotive Functional Safety”. During this webinar, you’ll learn about:
  • Development and verification tools proven to help provide safer automotive software systems
  • How suppliers can develop software systems faster with an OS tuned for automotive safety

Ensuring Automotive Functional Safety with QNX and LDRA
Thursday, November 20, 2014
9:00 am PST / 12:00 pm EST / 5:00 pm UTC

I hope you can join us!

Japan update: ADAS, wearables, integrated cockpits, and autonomous cars

Yoshiki Chubachi
Will the joy of driving be a design criterion for tomorrow’s vehicles? It had better be.

A couple of weeks ago, QNX Software Systems sponsored Telematics Japan in Tokyo. This event offers a great opportunity to catch up with colleagues from automotive companies, discuss technology and business trends, and showcase the latest technology demos. Speaking of which, here’s a photo of me with a Japan-localized demo of the QNX CAR Platform. You can also see a QNX-based digital instrument cluster in the lower-left corner — this was developed by Three D, one of our local technology partners:



While at the event, I spoke on the panel, “Evolving ecosystems for future HMI, OS, and telematics platform development.” During the discussion, we conducted a real-time poll and asked the audience three questions:

1) Do you think having Apple CarPlay and Android Auto will augment a vehicle brand?
2) Do you expect wearable technologies to be integrated into cars?
3) If your rental car were hacked, who would you complain to?

For question 1, 32% of the audience said CarPlay and Android Auto will improve a brand; 68% didn't think so. In my opinion, this result indicates that smartphone connectivity in cars is now an expected feature. For question 2, 76% answered that they expect to see wearables integrated into cars. This response gives us a new perspective — people are looking at wearables as a possible addition to go with ADAS systems. For example, a wearable device could help prevent accidents by monitoring the driver for drowsiness and other dangerous signs. For question 3, 68% said they would complain to the rental company. Mind you, this raises the question: if your own car were hacked, who would you complain to?

Integrated cockpits
There is growing concern around safety and security as companies attempt to grow more business by leveraging connectivity in cars. The trend is apparent if you look at the number of safety- and security-related demos at various automotive shows.

Case in point: I recently attended a private automotive event hosted by Renesas, where many ADAS and integrated cockpit demos were on display. And last month, CEATEC Japan (aka the CES of Japan) featured integrated cockpit demos from companies like Fujitsu, Pioneer, Mitsubishi, Kyocera, and NTT Docomo.

For the joy of it
Things are so different from when I first started developing in-car navigation systems 20 years ago. Infotainment systems are now turning into integrated cockpits. In Japan, the automotive industry is looking at the early 2020s as the time when commercially available autonomous cars will be on the road. In the coming years, the in-car environment, including infotainment, cameras, and other systems, will change immensely — I’m not exactly sure what cars in the year 2020 will look like, but I know it will be something I could never have imagined 20 years ago.

A panel participant at Telematics Japan said to me, “If autonomous cars become reality and my car is not going to let me drive anymore, I am not sure what the point of having a car is.” This is true. As we continue to develop for future cars, we may want to remind ourselves of the “joy of driving” factor.

Are you ready to stop micromanaging your car?

I will get to the above question. Honest. But before I do, allow me to pose another one: When autonomous cars go mainstream, will anyone even notice?

The answer to this question depends on how you define the term. If you mean completely and absolutely autonomous, with no need for a steering wheel, gas pedal, or brake pedal, then yes, most people will notice. But long before these devices stop being built into cars, another phenomenon will occur: people will stop using them.

Allow me to rewind. Last week, Tesla announced that its Model S will soon be able to “steer to stay within a lane, change lanes with the simple tap of a turn signal, and manage speed by reading road signs and using traffic-aware cruise control.” I say soon because these functions won't be activated until owners download a software update in the coming weeks. But man, what an update.

Tesla may now be at the front of the ADAS wave, but the wave was already forming — and growing. Increasingly, cars are taking over mundane or hard-to-perform tasks, and they will only become better at them as time goes on. Whether it’s autonomous braking, automatic parking, hill-descent control, adaptive cruise control, or, in the case of the Tesla S, intelligent speed adaptation, cars will do more of the driving and, in so doing, socialize us into trusting them with even more driving tasks.

Tesla Model S: soon with autopilot
In other words, the next car you buy will prepare you for not having to drive the car after that.

You know what’s funny? At some point, the computers in cars will probably become safer drivers than humans. The humans will know it, but they will still clamor for steering wheels, brake pedals, and all the other traditional accoutrements of driving. Because people like control. Or, at the very least, the feeling that control is there if you want it.

It’s like cameras. I would never think of buying a camera that didn’t have full manual mode. Because control! But guess what: I almost never turn the mode selector to M. More often than not, it’s set to Program or Aperture Priority, because both of these semi-automated modes are good enough, and both allow me to focus on taking the picture, not on micromanaging my camera.

What about you? Are you ready for a car that needs a little less micromanagement?

Domo arigato, for self-driving autos

Lynn Gayowski
When talk moves to autonomous cars, Google's self-driving car is often the first project that springs to mind. However, there are a slew of automakers with autonomous or semi-autonomous vehicles in development — Audi, BMW, General Motors, Mercedes-Benz, and Toyota, to name a few. And did you know that QNX has been involved with autonomous projects since 1997?

Let's begin at the beginning. Obviously the first step is to watch the 1983 Mr. Roboto music video. To quote selectively, "I've come to help you with your problems, so we can be free." As Styx aptly communicated with the help of synthesizers, robots have the potential to improve our lives. Current research predicts autonomous cars will reduce traffic collisions and improve traffic flow, plus drivers will be freed up for other activities.

So let's take a look at how QNX has been participating in the progress to self-driving vehicles.



The microkernel architecture of the QNX operating system provides an exemplary foundation for systems with functional safety requirements, and as you can see from this list, there are projects related to cars, underwater robots, and rescue vehicles.

Take a look at this 1997 video from the California Partners for Advanced Transportation Technology (PATH) and the National Automated Highway System Consortium (NAHSC) showing their automated driving demo — the first project referenced on our timeline. It's interesting that the roadway and driving issues mentioned in this video still hold true 17 years later.



We're estimating that practical use of semi-autonomous cars is still 4 years away and that fully autonomous vehicles won't be available to the general public for about another 10 years after that. So stay tuned to the QNX Auto Blog. I'm already envisioning a 30-year montage of our autonomous projects. With a stirring soundtrack by Styx.

The summer road trip of 2017 – Part II

Lynn Gayowski
Our series looking at how in-car technologies will transform your summer road trip continues with part II. 2017 is around the corner, and between now and then, automakers will introduce a bevy of new features that will make for a safer and more enjoyable summer road trip. In our first part, we looked at your road trip soundtrack, navigation, and mobile device connectivity. This week, we look at safety, acoustics, and autonomous cars as we cruise to the last exit for this blog series.

Staying safe
By 2017, we likely won’t have developed the technology to shrink your mechanic down to a size that allows you to perch one on your dashboard like a bobble-head, but many cars will have a “virtual mechanic.” This application will let you check lights, fluids, tire pressure, and other system vitals, all through your center stack, digital instrument cluster, or phone – as seen below. The safety speedometer is hardly a new idea (see the Plymouth safety speedometer from 1939), but its modern implementation in the cars of 2017, in the form of vision systems performing road sign detection, might just mean fewer speeding tickets on your road trip, especially as you cruise through unfamiliar areas.



Staying in touch
Sometimes you want to take a road trip to get away from the world, but sometimes you still want (or need) to stay connected. Whether it’s phone calls, texts, or emails, all of this information will continue to be seamlessly integrated into your car in 2017. Less fumbling, fewer distractions.


And low-quality, stilted speakerphone calls will be a thing of the past with the emerging crop of acoustic technologies. Driving alone on a stretch of road and miss having your loved ones close by? Advanced duplex technology will make it seem as though the person on the other end of your phone conversation is sitting right beside you in the passenger’s seat.  


Another cool development? You won’t have to struggle to use voice recognition technologies because of a noisy in-car cabin (that’s right, serenely quiet cabins will no longer be exclusive to luxury cars). Vehicles will continue to evolve to meet the strictest CAFE and emissions standards, while the negative acoustic side effects of using less damping material will be countered by software that removes unwanted engine sound. And your engine in 2017 might really sound like it’s purring (or growling, if that’s your preference), as signature sounds are enabled by engine sound enhancement software. So not only will you not feel crazy for talking to your car, you’ll also be less frustrated as you do so cruising down the interstate.


Beyond 2017: Look ma, no hands!
While it won’t happen quite as soon as 2017, autonomous cars will hit the roads in the relatively near future, forever changing the dynamic of the road trip. Will road trips be more accessible for the elderly and others who can’t physically drive long distances? Will the new meaning of "cruise control" make the road trip more or less enjoyable? All of these considerations are up for discussion. One thing is certain: many of the advanced safety systems of today and 2017 are precursors to cars that could drive themselves. One such example of what the future of autonomous driving will look like is the University of Parma’s DEEVA autonomous car project being developed by the Artificial Vision and Intelligent Systems Laboratory (VisLab).  


How is in-car technology playing a role in your current summer road trip? How do you want it to improve your future road trips? Stay tuned to our QNX_Auto Twitter account and Facebook page for weekly discussions throughout the rest of the summer about what 2017 has in store for your road trip.

Acoustics, ADAS, and autonomous cars, oh my!

Lynn Gayowski
Trying to make sense of where automotive technology is headed can be as tricky as finding your way through a poppy field while avoiding flying monkeys. Well strap on your shiny, red, video-watching shoes because Derek Kuhn can help. Derek, VP marketing and sales for QNX, was interviewed at Telematics Detroit and did an excellent job of summing up the latest on automotive acoustics, advanced driver assistance systems (ADAS), and autonomous cars.

QNX announced the new QNX OS for Automotive Safety at Telematics Detroit, so safety was clearly top of mind during the interview. One question posed was whether automakers have the potential to use safety options as revenue generators. There's a quote here I love: "Safety shouldn't be about premium." OEMs need to find cost-effective ways to bring next-generation safety to the mass market, not just luxury vehicles.

The section of the video I find most interesting is when Derek discusses how acoustics in a car play a big role in creating "the emotion and experience of driving." Noise reduction technology and engine sound enhancement both have a significant impact on a driver's affinity for a vehicle, and OEMs are taking note.

Check out the video for yourself here, my pretties:



Talking safety in Novi

Grant Courville
Last week, I had the pleasure of participating in a panel at Telematics Update's Advanced Automotive Safety Conference in Novi, Michigan. A key theme of the panel was — you guessed it — safety.

The two-day event brought together automakers, suppliers, government representatives, research groups, integrators, analysts, and educational institutions to discuss the latest standards and innovations in automotive safety and V2X. The show covered all aspects of vehicle connectivity, as well as the relationship of big data and cloud connectivity to automotive security.

The themes of reliability, security, and safety were front and center in my panel, “Automated Vehicles: The Stepping Stone to Autonomous Driving.” The panel was chaired by IHS Automotive and included experts from DENSO, Ricardo Inc., and the National Advanced Driving Simulator. Everyone on the panel agreed that interoperability and standardization are critical to accelerating innovation, and that ADAS systems are paving the path to autonomous driving.

All in all, the show was an informative event that helped identify the next steps in automotive safety — a topic near and dear to the QNX auto team.


Grant Courville is director of product management at QNX Software Systems.

A matter of urgency: preparing for ISO 26262 certification

Yoshiki Chubachi
Guest post by Yoshiki Chubachi, automotive business development manager for QNX Software Systems, Japan

Two weeks ago in Tokyo, QNX Software Systems sponsored an ISO 26262 seminar hosted by IT Media MONOist, a Japanese information portal for engineers. This was the fourth MONOist seminar to focus on the ISO 26262 functional safety standard, and the theme of the event conveyed an unmistakable sense of urgency: “You can’t afford to wait any longer: how you should prepare for ISO 26262 certification”.

In his opening remarks, Mr. Pak, a representative of MONOist, noted that the number of attendees for this event increases every year. And, as the theme suggests, many engineers in the automotive community feel a strong need to get ready for ISO 26262. In fact, registration filled up just three days after the event was announced.

The event opened with a keynote speech by Mr. Koyata of the Japan Automobile Research Institute (JARI), who spoke on functional safety as a core competency for engineers. A former engineer at Panasonic, Mr. Koyata now works as an ISO 26262 consultant at JARI. In his speech, he argued that every automotive developer should embrace knowledge of ISO 26262 and that automakers and Tier 1 suppliers should adopt a functional "safety culture." Interestingly, his argument aligns with what Chris Hobbs and Yi Zheng of QNX advocate in their paper, “10 truths about building safe embedded software systems.” Mr. Koyata also discussed the difference between safety and hinshitsu (quality), which has long been a strength of Japanese industry.

Next up were presentations by the co-sponsor DNV Business Assurance Japan. The talks focused on safety concepts and architecture as well as on metrics for hardware safety design for ISO 26262.

I had the opportunity to present on software architecture and functional safety, describing how the QNX microkernel architecture can provide an ideal system foundation for automotive systems with functional safety requirements. I spoke to a number of attendees after the seminar, and they all recognized the need to build an ISO 26262 process, but didn’t know how to start. The need, and opportunity, for education is great.

Yoshiki presenting at the MONOist ISO 26262 seminar. Source: MONOist

The event ended with a speech by Mr. Shiraishi of Keio University. He has worked on space satellite systems and offered some interesting comparisons between the functional safety of space satellites and automotive systems.

Safety and reliability go hand in hand. “Made in Japan” is a brand widely known for its reliability. Although Japan is somewhat behind when it comes to awareness of ISO 26262 certification, I see great potential for it to become a leader in automotive safety. Japanese engineers take pride in the reliability of the products they build, and this mindset can be extended to the new generation of functional safety systems in the automotive industry.


Additional reading

QNX Unveils New OS for Automotive Safety
Architectures for ISO 26262 systems with multiple ASIL requirements (whitepaper)
Protecting Software Components from Interference in an ISO 26262 System (whitepaper)
Ten Truths about Building Safe Embedded Software Systems (whitepaper)

(My latest) top 12 articles on robot cars

Human error accounts for 9 out of 10 vehicle accidents. That alone is a compelling argument for building more autonomy into cars. After all, a robot car won't get moody or distracted, but will remain alert at all times. Moreover, it will respond quickly and consistently to dangerous situations, if programmed correctly. The problem, of course, is that it will respond, and you may not always be happy with the decisions it makes.

For instance, what happens if five children playing tag suddenly run in front of your robot car — should it opt for the greater good and avoid them, even if that puts you in mortal danger? Or should it hand over control and let you decide? Some would argue that such questions are moot, for the simple reason that autonomous cars may significantly reduce accidents overall. Nonetheless, these questions go to the heart of how we see ourselves in relation to the machines we use every day. They demand discussion.

Speaking of discussion, I'd love to hear your thoughts on any of these articles. I don't agree with everything they say, but they certainly got me thinking. I think they'll do the same for you.

  • The Psychology Of Anthropomorphic Robots (Fast Company) — Convincing people to trust a self-driving car is surprisingly easy: just give it a cute face and a warm voice.
     
  • The Robot Car of Tomorrow May Just Be Programmed to Hit You (WIRED) — In a situation where a robot car must hit either of two vehicles, should it hit the vehicle with the better crash rating? If so, wouldn't that penalize people for buying safer cars? A look at why examining edge cases is important in evaluating crash-avoidance algorithms.
     
  • The Ethics of Autonomous Cars (The Atlantic) — Will your robot car know when to follow the law — and when to break it? And who gets to decide how your car will decide?
     
  • IEET Readers Divided on Robot Cars That Sacrifice Drivers’ Lives (IEET) — In response to the above story, the Institute for Ethics and Emerging Technologies asked its readers whether a robot car should sacrifice the driver's life to save the lives of others. Not everyone was convinced.
     
  • How to Make Driverless Cars Behave (TIME) — Did you know that Stanford’s CARS group has already developed tools to help automakers code morality into their cars? Yeah, I didn’t either. On the other hand, if driverless cars lead to far fewer accidents overall, will they even need embedded morality?
     
  • When 'Driver' Goes the Way of 'Computer' (The Atlantic) — Many of us imagine that autonomous vehicles will look and feel a lot like today’s cars. But guess what: once the human driver is out of the picture, long-standing assumptions about how cars are designed go out the proverbial window.
     
  • The end of driving (as we know it) (Fortune) — In Los Angeles, people drive 300 million miles every day. Now imagine if they could spend some or all of that time doing something else.
     
  • A Path Towards More Sustainable Personal Mobility (Stanford Energy Club) — If you find the Los Angeles statistic startling, consider this: every year in the US, light-duty vehicles travel three trillion passenger miles — that’s 3×10¹². Autonomous vehicles could serve as one element in a multi-pronged approach to reduce this number and help the environment.
     
  • How Shared Vehicles Are Changing the Way We Get Around (StreetsBlog USA) — If access is more important than ownership, will fleets of sharable autonomous cars translate into fewer cars on the road? The answer is yes, according to some research.
     
  • Driving revenues: Autonomous cars (EDN) — According to Lux Research, software accounts for a large fraction of the revenue opportunity in autonomous cars. Moreover, the car OS could be a differentiating factor for auto manufacturers.
     
  • Autonomous Vehicles Will Bring the Rise of 'Spam Cars' (Motherboard) — Though it would be a long, long time before this ever happened, the idea isn’t as goofy as you might think.
     
You can find my previous top 12 robo-car articles here.

12 autonomous car articles worth reading

You know what's fascinating about autonomous cars? Everything. They raise as many questions as they do answers, and many of those questions drive right to the heart of how we see ourselves and the world around us. For instance, will autonomous cars introduce a new era of independence for the elderly? Will they change the very nature of car ownership? Will they reduce traffic fatalities and help make traffic jams a thing of the past?

Technically, legally, economically, and socially, autonomous cars are a game-changer. I like thinking about them, and I like reading what other people think about them. And just what have I been reading? I thought you'd never ask. Here, in no particular order, are 12 articles that have caught my eye in the last month.

So there you have it. I don't, of course, agree with every point in every article, but they have all taught me something I didn't know or clarified something I already knew. I hope they do the same for you.

Self-driving cars? We had ‘em back in ‘56

Quick: What involves four-part harmony singing, control towers in the middle of the desert, and a dude smoking a fat stogie? Give up? It's the world of self-driving cars, as envisioned in 1956.

No question, Google’s self-driving car has captured the public imagination. But really, the fascination is nothing new. For instance, at the 1939 World’s Fair, people thronged to see GM’s Futurama exhibit, which depicted a world of cars controlled by radio signals. GM continued to promote its autonomous vision in the 1950s with the Firebird II, a turbine-powered car that could drive itself by following an "electronic control strip" embedded in the road. Here, for example, is a GM-produced video from 1956 in which a musically adept family goes for an autonomous drive:



Fast-forward to today, when it seems that everyone is writing about self-driving cars. Most articles don’t add anything new to the discussion, but their ubiquity suggests that, as a society, we are preparing ourselves for a future in which we give up some degree of control to our vehicles. I find it fascinating that an automaker was at the forefront of this process as far back as the 1930s. Talk about looking (way) ahead.

And you know what’s cool? Comparing the vision of the good life captured in the above video with the vision captured in the “Imagined” video that QNX produced 56 years later. In both cases, autonomous drive forms part of the story. And in both cases, an autonomous car helps to bring family together, though in completely different ways. It seems that, no matter how much technology (and our vision of technology) changes, the things closest to our hearts never do:



One more thing. Did you notice how much the sets in the GM video look like something straight out of The Jetsons, right down to the bubble-domed car? They did to me. Mind you, the video predates The Jetsons by several years, so if anything, the influence was the other way around.


Frankenstein and the future networked car

So what do Frankenstein and the future networked car have in common, you ask? Simple: both are compelling stories brought to life in Geneva, Switzerland.

In Mary Shelley’s Frankenstein, the creature is seen climbing Mont-Salève after having fled Geneva during a lightning storm:

“I thought of pursuing the devil; but it would have been in vain, for another flash discovered him to me hanging among the rocks of the nearly perpendicular ascent of Mont-Salève.”

Mont-Salève, overlooking Geneva
Photo: Benoit Kornmann
Of course, the future networked car is a very different type of story, but compelling nonetheless. The laboratory in this story is the ITU Symposium on The Future Networked Car, being held within the Geneva Auto Show on March 5 and 6, where many new ideas will be brought to life by convening leaders and technical experts from the automotive and ICT communities.

The event, organized by the International Telecommunications Union (ITU), will consist of high-level dialogues and several technical sessions; these include a session on integrating nomadic devices in cars, where I will discuss how technology standards can help minimize driver distraction. The dialogues will cover road safety and innovation for the future car, and will feature key leaders such as the presidents of Fédération Internationale de l’Automobile (Jean Todt) and Infiniti (Johan de Nysschen). The technical sessions will explore automated driving, connected car use cases, emergency services, and, of course, nomadic device integration. Speakers for these sessions come from a mix of automakers, tier one suppliers, ICT companies, standards development organizations (SDOs), industry groups, and government agencies.

The symposium also includes a session jointly organized by the ITU and UNECE Inland Transport Committee that deals with the human factors and regulatory issues introduced by automated driving. This session is an encouraging sign that the ITU and UNECE will continue the collaboration they started last June (see my previous post, “UN agencies take major step towards international standards for driver distraction”).

Hope to see you in Geneva!

QNX at CES: The media’s take

No, CES isn’t over yet. But the technology concept cars showcased in the QNX booth have already stoked the interest of journalists attending the event — and even of some not attending the event. So here, in no particular order, are examples of what they're saying.

Oh, and I’ve added a couple of stories that aren’t strictly CES-related, but appeared this week. They were too relevant to pass up.

That's it for now. I aim to post more stories and videos early next week. Stay tuned.

What happens when autonomous becomes ubiquitous?

Seventeen ways in which the self-driving car will transform how we live.

Let’s speculate that at least 25% of cars on the road are autonomous — and that those cars are sufficiently advanced to operate without a human driver. Let’s also assume that the legal issues have been sorted out somehow.

How would this impact society?

  • The elderly could maintain their independence. Even if they have lost the ability to drive, they could still get groceries, go to appointments, visit family and friends, or just go for a drive.
     
  • Cars could chauffeur intoxicated folks safely home — no more drunk drivers.
     
  • Municipalities could get rid of buses and trains, and replace them with fleets of vehicles that would pick people up and drop them off exactly where they want to go. Mass transit would become individual transit.
     
  • Car sharing would become more popular, as the cost could be spread among multiple people. Friends, family members, or neighbors could chip in to own a single car, reducing pollution as well as costs. The cars would shuffle themselves to where they are needed, depending on everyone’s individual needs.
     
  • Fewer vehicles would be produced, but they would be more expensive. This could drive some smaller automakers out of business or force more industry consolidation.
     
  • Cities could get rid of most parking lots and garages, freeing up valuable real estate for homes, businesses, or parks.
     
  • Taxi companies would either go out of business or convert to autonomously piloted vehicles. Each taxi could be equipped with anti-theft measures, alerting police if, say, it detects it is being loaded onto a truck.
     
  • We could have fewer roads with higher capacities. Self-directed cars would be better equipped to negotiate inter-vehicle space, being more “polite” to other vehicles; they would also enable greater traffic density.
     
  • Instead of creating traffic jams, heavy traffic would maintain a steady pace, since the vehicles would operate as a single platoon.
     
  • Autonomous cars could completely avoid roads under construction and scatter themselves evenly throughout the surrounding route corridors to minimize the impact on detour routes.
     
  • There would be no more hunting for parking spots downtown. Instead, people could tell their cars to go find a nearby parking spot and use their smartphones to summon the cars back once they’re ready to leave.
     
  • Concerts or sporting events would operate more smoothly, as cars could coordinate where they’re parking. The flow of vehicles exiting from events would be more like a ballet than a mosh pit.
     
  • Kids growing up with autonomous cars would enjoy a new level of independence. They could get to soccer games without needing mom or dad to drive them. Parents could program the car to drive the children only to fixed destinations, such as the soccer field and home.
     
  • School buses could become a thing of the past. School boards could manage fleets of cars that would pick up the children as needed by geographic grouping.
     
  • You could send your car out for errands, and companies would spring up to cater to “driverless” cars. For example, you could set up your grocery list online and send your car to pick up the order; a clerk would load your groceries into the car when it shows up at the supermarket.
     
  • Rental car companies could start offering cars that come to you when you need them. Renting cars may become more popular than owning them, since people who drive infrequently could pay by the ride, as opposed to paying the capital cost of owning a vehicle.
     
  • Cars would become like living rooms and people would enjoy the ride like never before — reading, conversing, exercising, watching TV. Some people may even give up their home to adopt a completely mobile existence.
     

Top 10 challenges facing the ADAS industry

Tina Jeffrey
It didn’t take long. Just months after the release of the ISO 26262 automotive functional safety standard in 2011, the auto industry began to grasp its importance and adopt it in a big way. Safety certification is gaining traction in the industry as automakers introduce advanced driver assistance systems (ADAS), digital instrument clusters, heads-up displays, and other new technologies in their vehicles.

Governments around the world, in particular those of the United States and the European Union, are calling for the standardization of ADAS features. Meanwhile, consumers are demonstrating a readiness to adopt these systems to make their driving experience safer. In fact, vehicle safety rating systems are becoming a vital ‘go to’ information resource for new car buyers. Take, for example, the European New Car Assessment Programme Advanced (Euro NCAP Advanced). This organization publishes safety ratings on cars that employ technologies with scientifically proven safety benefits for drivers. The emergence of these ratings encourages automakers to exceed minimum statutory requirements for new cars.

Sizing the ADAS market
ABI Research claims that the global ADAS market, estimated at US$16.6 billion at the end of 2012, will grow to more than US$260 billion by the end of 2020, representing a CAGR of 41%. This growth means that cars will ship with more of the following types of safety-certified systems:



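For readers who want to check the arithmetic, the forecast implies roughly the quoted growth rate. Here is a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check on the ABI Research forecast cited above:
# US$16.6 billion (end of 2012) growing to US$260 billion (end of 2020),
# i.e. eight years of compound growth.
start_usd_b = 16.6
end_usd_b = 260.0
years = 8

cagr = (end_usd_b / start_usd_b) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 41%, matching the quoted figure
```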
The 10 challenges
So what are the challenges that ADAS suppliers face when bringing systems to market? Here, in my opinion, are the top 10:
  1. Safety must be embedded in the culture of every organization in the supply chain. ADAS suppliers can't treat safety as an afterthought that is tacked on at the end of development; rather, they must embed it into their development practices, processes, and corporate culture. To comply with ISO 26262, an ADAS supplier must establish procedures associated with safety standards, such as design guidelines, coding standards and reviews, and impact analysis procedures. It must also implement processes to assure accountability and traceability for decisions. These processes provide appropriate checks and balances and allow for safety and quality issues to be addressed as early as possible in the development cycle.
     
  2. ADAS systems are a collaborative effort. Most ADAS systems must integrate intellectual property from a number of technology partners; they are too complex to be developed in isolation by a single supplier. Also, in a safety-certified ADAS system, every component must be certified — from the underlying hardware (be it a multi-core processor, GPU, FPGA, or DSP) to the OS, middleware, algorithms, and application code. As for the application code, it must be certified to the appropriate automotive safety integrity level; the level for the ADAS applications listed above is typically ASIL D, the highest level of ISO 26262 certification.
     
  3. Systems may need to comply with multiple industry guidelines or specifications. Besides ISO 26262, ADAS systems may need to comply with additional criteria, as dictated by the tier one supplier or automaker. On the software side, these criteria may include AUTOSAR or MISRA. On the hardware side, they will include AEC-Q100 qualification, which involves reliability testing of auto-grade ICs at various temperature grades. ICs must function reliably over temperature ranges that span -40 degrees C to 150 degrees C, depending on the system.
     
  4. ADAS development costs are high. These systems are expensive to build. To achieve economies of scale, they must be targeted at mid- and low-end vehicle segments. Prices will then decline as volume grows and development costs are amortized, enabling more widespread adoption.
     
  5. The industry lacks interoperability specifications for radar, laser, and video data in the car network. For audio-video data alone, automakers use multiple data communication standards, including MOST (media-oriented system transport), Ethernet AVB, and LVDS. As such, systems must support a multitude of interfaces to ensure adoption across this broad spectrum. Systems may also need additional interfaces to support radar or lidar data.
     
  6. The industry lacks standards for embedded vision-processing algorithms. Ask five different developers to develop a lane departure warning system and you’ll get five different solutions. Each solution will likely start with a MATLAB implementation that is ported to run on the selected hardware. If the developer is fortunate, the silicon will support image-processing primitives (a library of functions designed for use with the hardware) to accelerate development. TI, for instance, has a set of image and video processing libraries (IMGLIB and VLIB) optimized for its silicon. These libraries serve as building blocks for embedded vision-processing applications. For instance, IMGLIB has edge detection functions that could be used in a lane departure warning application.
     
  7. Data acquisition and data processing for vision-based systems are high-bandwidth and computationally intensive. Vision-based ADAS systems present their own set of technical challenges. Different systems require different image sensors operating at different resolutions, frame rates, and lighting conditions. A system that performs high-speed forward-facing driver assistance functions such as road sign detection, lane departure warning, and autonomous emergency braking must support a higher frame rate and resolution than a rear-view camera that performs obstacle detection. (A rear-view camera typically operates at low speeds, and obstacles in the field of view are in close proximity to the vehicle.) Compared to the rear-view camera, an LDW, AEB, or RSD system must acquire and process more incoming data at a faster frame rate, before signaling the driver of an unintentional lane drift or warning the driver that the vehicle is exceeding the posted speed limit.
     
  8. ADAS cannot add to driver distraction. The growing complexity of in-vehicle tasks and displays can result in driver information overload. Systems are becoming more integrated and are presenting more data to the driver. Information overload could result in high cognitive workload, reducing situational awareness and countering the efficacy of ADAS. Systems must therefore be easy to use, should employ the most appropriate modalities (visual, auditory, haptic), and should be designed to encourage driver adoption. Development teams must establish a clear specification of the driver-vehicle interface early in development to ensure user and system requirements are aligned.
     
  9. Environmental factors affect ADAS. ADAS systems must function under a variety of weather and lighting conditions. Ideally, vision-based systems should be smart enough to understand when they are operating in poor visibility scenarios such as heavy fog or snow, or when direct sunlight shines into the lens. If the system detects that the lens is occluded or that the lighting conditions are unfavorable, it can disable itself and warn the driver that it is non-operational. Another example is an ultrasonic parking sensor that becomes prone to false positives when encrusted with mud. Combining the results of different sensors or different sensor technologies (sensor fusion) can often provide a more effective solution than using a single technology in isolation.
     
  10. Testing and validating is an enormous undertaking. Arguably, testing and validation is the most challenging aspect of ADAS development, especially when it comes to vision systems. Prior to deploying a commercial vision system, an ADAS development team must amass hundreds if not thousands of hours of video clips in a regression test database, in an effort to test all scenarios. The ultimate goal is to achieve 100% accuracy and zero false positives under all possible conditions: traffic, weather, number of obstacles or pedestrians in the scene, etc. But how can the team be sure that the test database comprises all test cases? The reality is that they cannot — which is why suppliers spend years testing and validating systems, and performing extensive real-world field-trials in various geographies, prior to commercial deployment.
     
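To put the bandwidth point in challenge 7 in perspective, consider raw pixel throughput. The camera parameters below are illustrative assumptions, not figures from any specific sensor or vendor, but they show why a forward-facing camera places a much heavier load on the system than a low-speed rear-view camera:

```python
def raw_bandwidth_mb_s(width, height, fps, bytes_per_pixel=2):
    """Raw (uncompressed) pixel throughput in megabytes per second.
    bytes_per_pixel=2 assumes a 16-bit format such as YUV 4:2:2."""
    return width * height * fps * bytes_per_pixel / 1e6

# Hypothetical configurations: a forward-facing LDW/RSD camera
# versus a low-speed rear-view camera.
forward = raw_bandwidth_mb_s(1280, 800, 30)  # ~61 MB/s, before any processing
rear = raw_bandwidth_mb_s(640, 480, 15)      # ~9 MB/s

print(f"Forward-facing camera: {forward:.1f} MB/s")
print(f"Rear-view camera:      {rear:.1f} MB/s")
```

Even under these modest assumptions, the forward-facing system must move and analyze several times more data per second, and every frame must be processed within a hard deadline before the next one arrives.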
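And to illustrate the sensor-fusion idea in challenge 9: one textbook approach is inverse-variance weighting, in which readings from independent sensors are combined in proportion to their confidence. This is a minimal sketch with hypothetical readings, not the algorithm of any particular ADAS product:

```python
def fuse(estimates):
    """Combine independent (value, variance) estimates by inverse-variance
    weighting; returns the fused value and its (smaller) variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical distance-to-obstacle readings in metres (value, variance):
ultrasonic = (2.0, 0.04)  # precise at close range, but fails when muddy
camera = (2.3, 0.25)      # noisier, e.g. in poor lighting

distance, variance = fuse([ultrasonic, camera])
# The fused estimate leans toward the more confident sensor, and its
# variance is lower than that of either sensor alone.
```

The payoff is exactly the point made above: the combined estimate is more trustworthy than either sensor in isolation, and a sensor that degrades (say, the mud-encrusted ultrasonic) can simply be down-weighted rather than trusted blindly.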
There are many hurdles to bringing ADAS to mainstream vehicles, but clearly, they are surmountable. ADAS systems are commercially available today, consumer demand is high, and the path towards widespread adoption is paved. If consumer acceptance of ADAS provides any indication of societal acceptance of autonomous drive, we’re well on our way.