The first-ever QNX technology concept car to hit CES

Paul Leroux
I bet you thought it was the Porsche. Or perhaps even the Bentley. But no, the first QNX-powered technology concept car to appear at CES was a digitally modded Prius — aka the LTE Connected Car. In fact, the car appeared at two CES shows: 2010 and 2011.

If you've never heard of the LTE Connected Car, it was a joint project of several companies, including QNX Software Systems and Alcatel-Lucent. The project members wanted to demonstrate how 4G/LTE networks could transform the driving experience and enable a host of new in-vehicle applications. This kind of thinking may seem like old hat today, but when the car was created, telecom companies had yet to light up their first commercial LTE towers. The car was definitely ahead of its time.

One of the four infotainment
systems in the LTE Connected Car
Almost everyone saw the entertainment potential of equipping a car with a 4G/LTE broadband connection — the ability to access your favorite music, applications, videos, or social media while on the road had immediate appeal. But many people also saw the other value proposition this car presented: the ability for vehicles to continuously upload information they have gathered about themselves or surrounding road conditions, providing, in the words of WIRED's Eliot Van Buskirk, "a crowd-sourced version of what traffic helicopters do today." Awesome quote, that.

QNX provided the software foundation for the LTE Connected Car, including the OS, touchscreen user interfaces, media players for YouTube and Pandora, navigation system, Bluetooth connectivity, games, and handsfree integration. But why am I blabbing on about this when I could show you? Cue the screen captures...

Google local search
First up is Google local search, which displayed local points of interest to help drivers and passengers find nearby restaurants, gas stations, movie theaters, ATMs, hospitals, and so on. And because this was an LTE-enabled car, the system could fetch these POIs from a cloud-based database:
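As a rough illustration of what such a POI lookup involves, here is a minimal Python sketch that ranks hypothetical POIs (as a cloud database might return them) by great-circle distance from the car. The coordinates, names, and function names are all made up for the example:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_pois(car_lat, car_lon, pois, limit=3):
    """Rank POIs (e.g., from a cloud lookup) by distance from the car."""
    ranked = sorted(pois, key=lambda p: haversine_km(car_lat, car_lon, p["lat"], p["lon"]))
    return ranked[:limit]

# Hypothetical response from a cloud POI database
pois = [
    {"name": "Gas station", "lat": 45.4210, "lon": -75.6910},
    {"name": "Hospital",    "lat": 45.4450, "lon": -75.6500},
    {"name": "ATM",         "lat": 45.4230, "lon": -75.6950},
]
print([p["name"] for p in nearest_pois(45.4215, -75.6972, pois)])
# -> ['ATM', 'Gas station', 'Hospital']
```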



Pandora Internet radio
For those who prefer to listen to what they like, and nothing else, the car also came with a Pandora app:



Home monitoring and control
Are you the kind of person who forgets to engage the burglar alarm before going to work? If so, the car's home automation app was just the ticket. It could let you manage home systems, such as lights and thermostats, from any of the car’s touchscreens — you could even view a live video feed from home security cameras:



Vehicle diagnostics
Now this is my favorite part. If you look below, you'll see the car's main screen for accessing vehicle diagnostics. At the upper right is the virtual mechanic app, which retrieved OBD-II codes from the vehicle bus to display the status of your brakes, tires, power train, electrical systems, fluids, and so on. (The current QNX CAR Platform for Infotainment includes an updated version of this app.)
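For the technically curious, decoding OBD-II data is straightforward once you know the PID formulas. Here is a minimal Python sketch using two standard SAE J1979 mode 01 PIDs; note that oil pressure is typically exposed through manufacturer-specific PIDs rather than the standard set, so a real virtual mechanic would also need OEM decode tables. The helper names are mine, not from the QNX app:

```python
# Decoders for two standard OBD-II mode 01 PIDs (SAE J1979).
# A and B are the first and second data bytes of the ECU's response.

def decode_rpm(a, b):
    """PID 0x0C: engine speed in rpm = (256*A + B) / 4."""
    return (256 * a + b) / 4

def decode_coolant_temp(a):
    """PID 0x05: coolant temperature in degrees C = A - 40."""
    return a - 40

print(decode_rpm(0x0C, 0x80))     # bytes 0x0C 0x80 -> 800.0 rpm (idle)
print(decode_coolant_temp(0x5A))  # byte 0x5A -> 50 degrees C
```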



Low oil pressure... yikes!
The virtual mechanic wouldn't fix your car for you. But it could tell you when things were going south and help you take appropriate action — before the problem escalated. In this case, it's saying that the engine oil pressure is low:



What to do? Well, if you were mechanically challenged, you could tap the fuel pump icon at the bottom of the screen to display a map of local service stations. Or you could tap on the dealership icon (Toyota, in this case) and find directions to the nearest, well, dealership:



The virtual mechanic would also let you zoom in on specific systems. For instance, in the following screen, the user has tapped the brake fluid button to learn the location of the brake fluid reservoir:



On the subject of zooming, let's zoom out for a second to see the entire car:



Moving pictures
Screen captures and photos can say only so much. For the back story on the LTE Connected Car, check out this video, which digs into the "philosophy" of the car and what the project members were working to accomplish:





An LTE Connected Car reader

CES Cars of Fame

It’s that time of year again — and we’re not just talking about turkeys and Christmas trees. CES 2014 is right around the corner and QNX Software Systems will again be at the show, ready to unveil a new technology concept car.

For the past couple of years, we’ve driven into CES with cars that explore the future of automotive technology. Each car represents an important part of QNX history and because of this, we're excited to launch CES Cars of Fame. Each week, we’ll highlight a car on our blog, Twitter account, and Facebook page that we have showcased at CES. We’ll look at what made these cars so special and at the response they generated in the media and auto industry. And you get to participate, too: at the end of the series, you can vote for your favorite car!

We’re kicking things off on Tuesday, November 19. So stay tuned to this space and to @QNX_Auto on Twitter and to the QNX Software Systems Facebook page.

Top 10 lessons learned from more than a decade in automotive

Guest post by John Wall, vice president of QNX engineering and services

Ten years ago, software accounted for about 20 to 30 percent of the effort that went into an infotainment system. Today, some would argue that it’s upwards of 90 percent. This makes sense if you ask yourself, “Where are all the red, burning issues?” They’re not on hardware, they’re on software. “Where is all the money being spent?” Software.

A big challenge in today’s automotive industry is acquiring the knowledge and experience to manage the complexity, cost, and risk of this dramatic change.

We at QNX have had the good fortune to work closely with tier one suppliers and their OEM customers since 1999. We've had their development teams live with us for months at a time — sometimes years. And we've lived with them, working as integrated teams. The end result is that our customers have learned a lot about the value we offer, and we have learned a great deal about addressing their requirements.

Drawing from this experience, here are my ten biggest takeaways:

1. Commitment
Not delivering is not acceptable. You get only one chance, and there’s no margin for failure. If development of an infotainment system fails to meet start-of-production deadlines, the car has to ship with a hole in the dash — or not at all. And if the system performs poorly, the OEM may end up having to use it anyway. But you can be sure that the supplier won’t be invited back.

2. Trust
Trust is a huge part of the business. People need to trust that you will do what you say and that their car line is going to ship. They need to know that you take their business seriously.

3. Realism
You need to be realistic. It isn’t worth being too optimistic. In fact, you’ll do damage with overly optimistic dates that you don’t hit.

4. Investment
There’s a ‘show me’ attitude in automotive. You have to be prepared to invest up front. We know a lot of tier ones that are building prototypes on their own dime. This is especially true if you’re courting a new customer; you’ve got to put skin in the game.

5. Reputation
It’s a small world — another important lesson. The auto industry is a tight-knit community. People move around a lot. It’s not unusual to go to a tier one supplier and see people you met six months earlier at their competitor’s. So maintaining your reputation is very important; it follows you everywhere.

6. Reliability
You can’t rest on your laurels. You need to repeatedly and consistently help customers successfully cross the finish line.

7. Honesty
You have to be honest. Often, a customer will say, “I want X” and you have to say, “Well, you can’t have X”. And you have to provide a good explanation why.

8. Relevancy
Ultimately it’s the market that decides. You can have champions within a customer's organization — even the guy who makes all the decisions — but ultimately the company has to build what consumers want. They’re a business; they will go with what sells. Your job is to anticipate market demands and offer products that are relevant to the consumer.

9. Flexibility
The market is evolving — quickly. Customers have to track moving targets, like integration with the newest smartphone models, and still get a reliable product out on time. Your products and services must give them the flexibility and adaptability they need.

10. Passion
If you don’t have it, you don’t belong in this market. Automotive is complex, it’s fast moving, and it’s too deep for anyone who thinks they can simply test the waters. Succeeding in automotive demands a phenomenal level of discipline and commitment. But if you love it, the rewards are worth it.

RealVNC, QNX team up for mobile-to-vehicle connectivity

Paul Leroux
This just in: QNX and RealVNC have announced that they are collaborating to bring RealVNC’s implementation of the MirrorLink smartphone-to-vehicle connectivity standard to the QNX CAR Platform for Infotainment.

With RealVNC’s MirrorLink-certified SDK integrated in the QNX CAR Platform, QNX can offer a variety of connectivity features for integrating cars and smartphones through Wi-Fi, Bluetooth, and USB.

“We are delighted to work with QNX on integrating VNC Automotive into the QNX CAR Platform... many tier 1 and auto OEM customers are already using the proven combination of RealVNC and QNX technologies in production programs,” said Tom Blackie, VP Mobile, RealVNC.

Read the full press release on the QNX website.

What's the word on HTML5?

Ten videos on HTML5 in the car. Actually, there are only nine — but I'm getting ahead of myself.

Paul Leroux
Has it been two years already? In November 2011, a group of my QNX colleagues, including Andy Gryc, launched a video series on using HTML5 in the car. They realized that HTML5 holds enormous potential for automotive infotainment, from reducing industry fragmentation to helping head units keep pace with the blistering rate of change in the mobile industry. They also realized it was important to get the word out — to help people understand that the power of HTML5 extends far beyond the ability to create web pages. And so, they invited a variety of thought leaders and industry experts with HTML5 experience to stand in front of the camera and share their stories.

All of which is to say: if you're interested in the future of HTML5 in the car, and in what thought leaders from companies such as OnStar, Audi, Gartner, Pandora, TCS, and QNX have to say about it, you've come to the right place. So let's get started, shall we?


Interview with Steve Schwinke of OnStar
Andy Gryc catches up with Steve Schwinke, director of advanced technology for OnStar, who is bullish on both the short- and long-term benefits of HTML5:




Interview with Mathias Haliger of Audi
Derek Kuhn of QNX sits down with Mathias Haliger, head of MMI system architecture at Audi AG, who discusses the importance of HTML5 to his company and to the industry at large:




The analyst perspective: Thilo Koslowski of Gartner
Andy gets together with Thilo Koslowski, VP Distinguished Analyst at Gartner, to discuss the notion of controlled openness for the car — and how HTML5 fits into the picture:




Interview with Tom Conrad of Pandora
Andy meets up with Tom Conrad, CTO at Pandora, to get his take on the benefits of standardizing on HTML5 across markets:




Interview with Michael Camp of TCS
Andy Gryc sits down with Michael Camp, director of engineering for in-car telematics at TeleCommunication Systems (TCS), to get a software supplier's perspective on HTML5:




Interview with Matthew Staikos
Andy talks with Matthew Staikos, former web-technology manager at BlackBerry, about the impact of HTML5 on hardware options, memory usage, and app stores:




The myth buster interview
Andy and Kerry Johnson get together to discuss how HTML5 apps can deliver snappy performance, run without a Web browser, and even work without an Internet connection:




Interview with Sheridan Ethier
Andy drops in on Sheridan Ethier, manager of the QNX CAR Platform development team, to get a developer's perspective on HTML5:




Kickoff video
And last but not least, here is the video that started it all. Andy Gryc gives his take on why he believes HTML5 is destined to become the foundation for next-gen automotive apps:




Blooper video
Did I say last but not least? Sorry, I have one more video that you just have to see:




What happens when autonomous becomes ubiquitous?

Seventeen ways in which the self-driving car will transform how we live.

Let’s speculate that at least 25% of cars on the road are autonomous — and that those cars are sufficiently advanced to operate without a human driver. Let’s also assume that the legal issues have been sorted out somehow.

How would this impact society?

  • The elderly could maintain their independence. Even if they have lost the ability to drive, they could still get groceries, go to appointments, visit family and friends, or just go for a drive.
     
  • Cars could chauffeur intoxicated folks safely home — no more drunk drivers.
     
  • Municipalities could get rid of buses and trains, and replace them with fleets of vehicles that would pick people up and drop them off exactly where they want to go. Mass transit would become individual transit.
     
  • Car sharing would become more popular, as the cost could be spread among multiple people. Friends, family members, or neighbors could chip in to own a single car, reducing pollution as well as costs. The cars would shuffle themselves to where they are needed, depending on everyone’s individual needs.
     
  • Fewer vehicles would be produced, but they would be more expensive. This could drive some smaller automakers out of business or force more industry consolidation.
     
  • Cities could get rid of most parking lots and garages, freeing up valuable real estate for homes, businesses, or parks.
     
  • Taxi companies would either go out of business or convert to autonomously piloted vehicles. Each taxi could be equipped with anti-theft measures, alerting police if, say, the taxi detects it is being loaded onto a truck.
     
  • We could have fewer roads with higher capacities. Self-directed cars would be better equipped to negotiate inter-vehicle space, being more “polite” to other vehicles; they would also enable greater traffic density.
     
  • Instead of creating traffic jams, heavy traffic would maintain a steady pace, since the vehicles would operate as a single platoon.
     
  • Autonomous cars could completely avoid roads under construction and scatter themselves evenly throughout the surrounding route corridors to minimize the impact on detour routes.
     
  • There would be no more hunting for parking spots downtown. Instead, people could tell their cars to go find a nearby parking spot and use their smartphones to summon the cars back once they’re ready to leave.
     
  • Concerts or sporting events would operate more smoothly, as cars could coordinate where they’re parking. The flow of vehicles exiting from events would be more like a ballet than a mosh pit.
     
  • Kids growing up with autonomous cars would enjoy a new level of independence. They could get to soccer games without needing mom or dad to drive them. Parents could program the car to drive the children only to fixed destinations, such as the soccer field and home.
     
  • School buses could become a thing of the past. School boards could manage fleets of cars that would pick up the children as needed by geographic grouping.
     
  • You could send your car out for errands, and companies would spring up to cater to “driverless” cars. For example, you could set up your grocery list online and send your car to pick up the order; a clerk would load your groceries into the car when it shows up at the supermarket.
     
  • Rental car companies could start offering cars that come to you when you need them. Renting cars may become more popular than owning them, since people who drive infrequently could pay by the ride, as opposed to paying the capital cost of owning a vehicle.
     
  • Cars would become like living rooms and people would enjoy the ride like never before — reading, conversing, exercising, watching TV. Some people may even give up their home to adopt a completely mobile existence.
     

My top moments of 2013 — so far

Paul Leroux
Yes, I know, 2013 isn’t over yet. But it’s been such a milestone year for our automotive business that I can’t wait another two months to talk about it. And besides, you’ll be busy as an elf at the end of December, visiting family and friends, skiing the Rockies, or buying exercise equipment to compensate for all those holiday carbs. Which means if I wait, you’ll never get to read this. So let’s get started.


We unveil a totally new (and totally cool) technology concept car
Times Square. We were there.
It all began at 2013 CES, when we took the wraps off the latest QNX technology concept car — a one-of-a-kind Bentley Continental GT. The QNX concept team outfitted the Bentley with an array of technologies, including a high-definition DLP display, a 3D rear-view camera, cloud-based voice recognition, smartphone connectivity, and… oh heck, just read the blog post to get the full skinny.

Even if you weren’t at CES, you could still see the car in action. Brian Cooley of CNET, Michael Guillory of Texas Instruments, the folks at Elektrobit, and Discovery Canada’s Daily Planet were just some of the individuals and organizations who posted videos. You could also connect to the car through a nifty web app. Heck, you could even see the Bentley’s dash on the big screen in Times Square, thanks to the promotional efforts of Elektrobit, who also created the 3D navigation software for the concept car.

We ship the platform
We wanted to drive into CES with all cylinders firing, so we also released version 2.0 of the QNX CAR Platform for Infotainment. In fact, several customers in the U.S., Germany, Japan, and China had already started to use the platform, through participation in an early access program. Which brings me to the next milestone...

Delphi boards the platform
The first of many.
Also at CES, Delphi, a global automotive supplier and long-time QNX customer, announced that version 2.0 of the QNX CAR Platform will form the basis of its next-generation infotainment systems. As it turned out, this was just one of several QNX CAR customer announcements in 2013 — but I’m getting ahead of myself.

We have the good fortune to be featured in Fortune
Fast forward to April, when Fortune magazine took a look at how QNX Software Systems evolved from its roots in the early 1980s to become a major automotive player. Bad news: you need a subscription to read the article on the Fortune website. Good news: you can read the same article for free on CNN Money. ;-)

A music platform sets the tone for our platform
In April, 7digital, a digital music provider, announced that it will integrate its 23+ million track catalogue with the QNX CAR Platform. It didn't take long for several other partners to announce their platform support. These include Renesas (R-Car system-on-chip for high-performance infotainment), AutoNavi (mobile navigation technology for the Chinese market), Kotei (navigation engine for the Japanese market), and Digia (Qt application framework).

We stay focused on distraction
Back in early 2011, Scott Pennock of QNX was selected to chair an ITU-T focus group on driver distraction. The group’s objective was serious and its work was complex, but its ultimate goal was simple: to help reduce collisions. This year, the group wrapped up its work and published several reports — but really, this is only the beginning of QNX and ITU-T efforts in this area.

We help develop a new standard
Goodbye fragmentation; hello
standard APIs.
Industry fragmentation sucks. It means everyone is busy reinventing the wheel when they could be inventing something new instead. So I was delighted to see my colleague Andy Gryc become co-chair of the W3C Automotive and Web Platform Business Group, which has the mandate to accelerate the adoption of web technologies in the car. Currently, the group is working to draft a standard set of JavaScript APIs for accessing vehicle data. Fragmentation, thy days are numbered.

We launch an auto safety program
A two-handed approach to
helping ADAS developers.
On the one hand, we have a 30-year history in safety-critical systems and proven competency in safety certifications. On the other hand, we have deep experience in automotive software design. So why not join both hands together and allow auto companies to leverage our full expertise when they are building digital instrument clusters, advanced driver assistance systems (ADAS), and other in-car systems with safety requirements?

That’s the question we asked ourselves, and the answer was the new QNX Automotive Safety Program for ISO 26262. The program quickly drew support from several industry players, including Elektrobit, Freescale, NVIDIA, and Texas Instruments.

We jive up the Jeep
A tasty mix of HTML5 & Android
apps, served on a Qt interface,
with OpenGL ES on the side.
If you don’t already know, we use a Jeep Wrangler as our reference vehicle — basically, a demo vehicle outfitted with a stock version of the QNX CAR Platform. This summer, we got to trick out the Jeep with a new, upcoming version of the platform, which adds support for Android apps and for user interfaces based on the Qt 5 framework.

Did I mention? The platform runs Android apps in a separate application container, much like it handles HTML5 apps. This sandboxed approach keeps the app environment cleanly partitioned from the UI, protecting both the UI and the overall system from unpredictable web content. Good, that.

The commonwealth’s leader honors our leader
I only ate one piece. Honest.
Okay, this one has nothing to do with automotive, but I couldn’t resist. Dan Dodge, our CEO and co-founder, received a Queen Elizabeth II Diamond Jubilee Medal in recognition of his many achievements and contributions to Canadian society. To celebrate, we gave Dan a surprise party, complete with the obligatory cake. (In case you’re wondering, the cake was yummy. But any rumors suggesting that I went back for a second, third, and fourth piece are total fabrications. Honestly, the stories people cook up.)

Mind you, Dan wasn’t the only one to garner praise. Sheridan Ethier, the manager of the QNX CAR development team, was also honored — not by the queen, but by the Ottawa Business Journal for his technical achievements, business leadership, and community involvement.

Chevy MyLink drives home with first prize — twice
There's nothing better than going home with first prize. Except, perhaps, doing it twice. In January, the QNX-based Chevy MyLink system earned a Best of CES 2013 Award, in the car tech category. And in May, it pulled another coup: first place in the "Automotive, LBS, Navigation & Safe Driving" category of the 2013 CTIA Emerging Technology (E-Tech) Awards.

Panasonic, Garmin, and Foryou get with the platform
Garmin K2 platform: because
one great platform deserves
another.
August was crazy busy — and crazy good. Within the space of two weeks, three big names in the global auto industry revealed that they’re using the QNX CAR Platform for their next-gen systems. Up first was Panasonic, who will use the platform to build systems for automakers in North America, Europe, and Japan. Next was Foryou, who will create infotainment systems for automakers in China. And last was Garmin, who are using the platform in the new Garmin K2, the company’s infotainment solution for automotive OEMs.

And if all that wasn’t cool enough…

Mercedes-Benz showcases the platform
Did I mention I want one?
When Mercedes-Benz decides to wow the crowds at the Frankfurt Motor Show, it doesn’t settle for second best. Which is why, in my not so humble opinion, they chose the QNX CAR Platform for the oh-so-desirable Mercedes-Benz Concept S-Class Coupé.

Mind you, this isn’t the first time QNX and Mercedes-Benz have joined forces. In fact, the QNX auto team and Mercedes-Benz Research & Development North America have collaborated since the early 2000s. Moreover, QNX has supplied the OS for a variety of Mercedes infotainment systems. The infotainment system and digital cluster in the Concept S-Class Coupé are the latest — and arguably coolest — products of this long collaboration.

We create noise to eliminate noise
Taking a sound approach to
creating a quieter ride.
Confused yet? Don’t be. You see, it’s quite simple. Automakers today are using techniques like variable cylinder management, which cut fuel consumption (good), but also increase engine noise (bad). Until now, car companies have been using active noise control systems, which play “anti-noise” to cancel out the unwanted engine sounds. All fine and good, but these systems require dedicated hardware — and that makes them expensive. So we devised a software product, QNX Acoustics for Active Noise Control, that not only outperforms conventional solutions, but can run on the car’s existing audio or infotainment hardware. Goodbye dedicated hardware, hello cost savings.
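The core idea of anti-noise is simple destructive interference, as this toy Python sketch shows: a pure engine-order tone summed with a phase-inverted copy of itself cancels to near zero. A production ANC system is far more involved, using adaptive filters (such as FxLMS) to track engine speed and cabin acoustics, so treat this strictly as the principle, not the product:

```python
import math

def tone(freq_hz, n, rate_hz=8000, phase=0.0):
    """n samples of a pure sine tone at the given sample rate."""
    return [math.sin(2 * math.pi * freq_hz * t / rate_hz + phase)
            for t in range(n)]

noise = tone(100.0, 400)                  # unwanted engine harmonic
anti  = tone(100.0, 400, phase=math.pi)   # same tone, 180 degrees out of phase
residual = [x + y for x, y in zip(noise, anti)]

print(max(abs(s) for s in residual))  # ~0: the tones cancel
```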

And we flub our lines on occasion
Our HTML5 video series has given companies like Audi, OnStar, Gartner, TCS, and Pandora a public forum to discuss why HTML5 and other open standards are key to the future of the connected car. The videos are filled with erudite conversation, but every now and then, it becomes obvious that sounding smart in front of a camera is a little harder than it looks. So what did we do with the embarrassing bits? Create a blooper reel, of course.

Are these bloopers our greatest moments? Nope. Are they among the funniest? Oh yeah. :-)

Top 10 challenges facing the ADAS industry

Tina Jeffrey
It didn’t take long. Just months after the release of the ISO 26262 automotive functional safety standard in 2011, the auto industry began to grasp its importance and adopt it in a big way. Safety certification is gaining traction in the industry as automakers introduce advanced driver assistance systems (ADAS), digital instrument clusters, heads-up displays, and other new technologies in their vehicles.

Governments around the world, in particular those of the United States and the European Union, are calling for the standardization of ADAS features. Meanwhile, consumers are demonstrating a readiness to adopt these systems to make their driving experience safer. In fact, vehicle safety rating systems are becoming a vital ‘go to’ information resource for new car buyers. Take, for example, the European New Car Assessment Programme Advanced (Euro NCAP Advanced). This organization publishes safety ratings on cars that employ technologies with scientifically proven safety benefits for drivers. The emergence of these ratings encourages automakers to exceed minimum statutory requirements for new cars.

Sizing the ADAS market
ABI Research claims that the global ADAS market, estimated at US$16.6 billion at the end of 2012, will grow to more than US$260 billion by the end of 2020, representing a CAGR of 41%. Which means that cars will ship with more of the following types of safety-certified systems:



The 10 challenges
So what are the challenges that ADAS suppliers face when bringing systems to market? Here, in my opinion, are the top 10:
  1. Safety must be embedded in the culture of every organization in the supply chain. ADAS suppliers can't treat safety as an afterthought that is tacked on at the end of development; rather, they must embed it into their development practices, processes, and corporate culture. To comply with ISO 26262, an ADAS supplier must establish procedures associated with safety standards, such as design guidelines, coding standards and reviews, and impact analysis procedures. It must also implement processes to assure accountability and traceability for decisions. These processes provide appropriate checks and balances and allow for safety and quality issues to be addressed as early as possible in the development cycle.
     
  2. ADAS systems are a collaborative effort. Most ADAS systems must integrate intellectual property from a number of technology partners; they are too complex to be developed in isolation by a single supplier. Also, in a safety-certified ADAS system, every component must be certified — from the underlying hardware (be it a multi-core processor, GPU, FPGA, or DSP) to the OS, middleware, algorithms, and application code. As for the application code, it must be certified to the appropriate automotive safety integrity level; the level for the ADAS applications listed above is typically ASIL D, the highest level of ISO 26262 certification.
     
  3. Systems may need to comply with multiple industry guidelines or specifications. Besides ISO 26262, ADAS systems may need to comply with additional criteria, as dictated by the tier one supplier or automaker. On the software side, these criteria may include AUTOSAR or MISRA. On the hardware side, they will include AEC-Q100 qualification, which involves reliability testing of auto-grade ICs at various temperature grades. ICs must function reliably over temperature ranges that span -40 degrees C to 150 degrees C, depending on the system.
     
  4. ADAS development costs are high. These systems are expensive to build. To achieve economies of scale, they must be targeted at mid- and low-end vehicle segments. Prices will then decline as volume grows and development costs are amortized, enabling more widespread adoption.
     
  5. The industry lacks interoperability specifications for radar, laser, and video data in the car network. For audio-video data alone, automakers use multiple data communication standards, including MOST (media oriented systems transport), Ethernet AVB, and LVDS. As such, ADAS systems must support a multitude of interfaces to ensure adoption across this broad spectrum of in-vehicle networks. Also, systems may need additional interfaces to support radar or lidar data.
     
  6. The industry lacks standards for embedded vision-processing algorithms. Ask five different developers to build a lane departure warning system and you’ll get five different solutions. Each solution will likely start with a Matlab implementation that is ported to run on the selected hardware. If the developer is fortunate, the silicon will support image processing primitives (a library of functions designed for use with the hardware) to accelerate development. TI, for instance, has a set of image and video processing libraries (IMGLIB and VLIB) optimized for their silicon. These libraries serve as building blocks for embedded vision processing applications. For instance, IMGLIB has edge detection functions that could be used in a lane departure warning application.
     
  7. Data acquisition and processing for vision-based systems are high-bandwidth and computationally intensive. Vision-based ADAS systems present their own set of technical challenges. Different systems require different image sensors operating at different resolutions, frame rates, and lighting conditions. A system that performs high-speed forward-facing driver assistance functions such as road sign detection, lane departure warning, and autonomous emergency braking must support a higher frame rate and resolution than a rear-view camera that performs obstacle detection. (A rear-view camera typically operates at low speeds, and obstacles in the field of view are in close proximity to the vehicle.) Compared to the rear-view camera, an LDW, AEB, or RSD system must acquire and process more incoming data at a faster frame rate, before signaling the driver of an unintentional lane drift or warning the driver that the vehicle is exceeding the posted speed limit.
     
  8. ADAS cannot add to driver distraction. The growing complexity of in-vehicle tasks and displays can result in driver information overload. Systems are becoming more integrated and are presenting more data to the driver. Information overload can increase cognitive workload, reducing situational awareness and countering the efficacy of ADAS. Systems must therefore be easy to use, should employ the most appropriate modalities (visual, auditory, haptic, and so on), and should be designed to encourage driver adoption. Development teams must establish a clear specification of the driver-vehicle interface early in development to ensure user and system requirements are aligned.
     
  9. Environmental factors affect ADAS. ADAS systems must function under a variety of weather and lighting conditions. Ideally, vision-based systems should be smart enough to understand when they are operating in poor visibility scenarios such as heavy fog or snow, or when direct sunlight shines into the lens. If the system detects that the lens is occluded or that the lighting conditions are unfavorable, it can disable itself and warn the driver that it is non-operational. Another example is an ultrasonic parking sensor that becomes prone to false positives when encrusted with mud. Combining the results of different sensors or different sensor technologies (sensor fusion) can often provide a more effective solution than using a single technology in isolation.
     
  10. Testing and validating is an enormous undertaking. Arguably, testing and validation is the most challenging aspect of ADAS development, especially when it comes to vision systems. Prior to deploying a commercial vision system, an ADAS development team must amass hundreds if not thousands of hours of video clips in a regression test database, in an effort to test all scenarios. The ultimate goal is to achieve 100% accuracy and zero false positives under all possible conditions: traffic, weather, number of obstacles or pedestrians in the scene, etc. But how can the team be sure that the test database comprises all test cases? The reality is that they cannot — which is why suppliers spend years testing and validating systems, and performing extensive real-world field trials in various geographies, prior to commercial deployment.
     
There are many hurdles to bringing ADAS to mainstream vehicles, but clearly, they are surmountable. ADAS systems are commercially available today, consumer demand is high, and the path towards widespread adoption is paved. If consumer acceptance of ADAS provides any indication of societal acceptance of autonomous drive, we’re well on our way.

Squeezing into a tight spot

Paul Leroux
No doubt about it, autonomous and semi-autonomous cars will present a variety of legal and ethical challenges. But they'll also offer many benefits — some of which will be pleasantly surprising.

Take parking, for example. Cars are getting wider, but parking spaces generally aren't. So how do you squeeze into a tight spot and then step out of your car without slamming your door into the car next to you? Well, what if you didn't have to be in the car? This new video from Ford tells all...



This technology is cool, especially for aging drivers who can't crane their necks as well as they used to. Still, some gotchas come to mind. For instance, other drivers might get peeved if you momentarily leave your car on the road so you can park it remotely. Also, what if you squeeze your car into a tight parking spot just inches away from the driver's door of the adjacent car — but that car doesn't support remote-controlled parking? How will the driver get back into his or her vehicle?

That said, these problems can be avoided with a little common sense on the part of the user. And I'll bet you dimes to donuts that this new technology from Ford can negotiate parking spaces more adroitly than most motorists. Which means that, eventually, we'll all have vehicles with fewer bumps, scuffs, and scratches. I could live with that.

A sound approach to creating a quieter ride

Tina Jeffrey
Add sound to reduce noise levels inside the car. Yup, you read that right. And while it may seem counterintuitive, it’s precisely what automakers are doing to provide a better in-car experience. Let’s be clear: I’m not talking about playing a video of SpongeBob SquarePants on the rear-seat entertainment system to keep noisy kids quiet — although I can personally attest to the effectiveness of this method. Rather, I’m referring to deliberately synthesized sound played over a vehicle’s speakers to cancel unwanted low-frequency engine tones in the passenger compartment, yielding a quieter and more pleasant ride.

So why is this even needed? It comes down to fuel economy. Automakers are continually looking at ways to reduce fuel consumption through techniques such as variable cylinder management (reducing the number of cylinders in operation under light engine load) and operating the engine at lower RPM. Some automakers are even cutting back on passive damping materials to decrease vehicle weight. These approaches do indeed reduce consumption, but they also result in more engine noise permeating the vehicle cabin, creating a noisier ride for occupants. To address the problem, noise, vibration, and harshness (NVH) engineers (the OEM engineers responsible for characterizing and improving sound quality in vehicles) are using innovative sound technologies such as active noise control (ANC).

Automotive ANC technology is analogous to the technology used in noise-cancelling headphones but is more difficult to implement, as developers must optimize the system based on the unique acoustic characteristics of the cabin interior. An ANC system must be able to function alongside a variety of other audio processing tasks such as audio playback, voice recognition, and hands-free communication.


The QNX Acoustics for Active Noise Control solution uses realtime engine data and sampled microphone data from the cabin to construct the “anti-noise” signal played over the car speakers.

So how does ANC work?
According to the principle of superposition, sound waves inside the car travel and reflect off glass, the dash, and other surfaces; interfere with each other; and yield a resultant wave of greater or lower amplitude than the original wave. The result varies according to where in the passenger compartment the signal is measured. At some locations, the waves “add” (constructive interference); at other locations, the waves “subtract” or cancel each other (destructive interference). Systems must be tuned and calibrated to ensure optimal performance at driver and passenger listening positions (aka “sweet spots”).

To reduce offending low-frequency engine tones (typically <150 Hz), an ANC system typically requires real-time engine data (including RPM) in addition to signals from the cabin microphones. The ANC system then synthesizes “anti-noise” signals that match the amplitude of the offending engine tones but are inverted in phase, and emits them via the car’s speakers. The net effect is a reduction of the offending tones.


According to the superposition principle of sound waves, a noise signal and an anti-noise signal will cancel each other if the signals are 180 degrees out of phase. Image adapted from Wikipedia.
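The cancellation described in the caption can be demonstrated numerically in a few lines of NumPy. This is a toy illustration of the superposition principle only, not the QNX implementation:

```python
import numpy as np

fs = 8000                      # sample rate, Hz (assumed for the demo)
t = np.arange(0, 0.1, 1 / fs)  # 100 ms of samples

# A 100 Hz engine tone, inside the <150 Hz band that ANC targets
noise = np.sin(2 * np.pi * 100 * t)

# Anti-noise: equal amplitude, 180 degrees out of phase
anti_noise = np.sin(2 * np.pi * 100 * t + np.pi)

# Superposition: the residual is effectively zero (on the order of 1e-16)
residual = noise + anti_noise
```

In a real cabin, of course, the anti-noise must be synthesized adaptively per speaker and tuned to the acoustics at each listening position; this snippet shows only why phase inversion cancels a tone.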

Achieving optimal performance for these in-vehicle systems is complex, and here’s why. First off, there are multiple sources of sound inside a car — some desirable and some not. These include the infotainment system, conversation between vehicle occupants, the engine, road, wind, and structural vibrations from air intake valves or the exhaust. Also, every car interior has unique acoustic characteristics. The location and position of seats; the position, number, and type of speakers and microphones; and the materials used inside the cabin all play a role in how an ANC system performs.

To be truly effective, an ANC solution must adapt quickly to changes in vehicle cabin acoustics that result from changes in acceleration and deceleration, windows opening and closing, changes in passenger seat positions, and temperature changes. The solution must also be robust; it shouldn’t become unstable or degrade the audio quality inside the cabin should, for example, a microphone stop working.

The solution for every vehicle model must be calibrated and tuned to achieve optimal performance. Besides the vehicle model, engine noise characteristics, and number and arrangement of speakers and microphones, the embedded platform being used also plays a role when tuning the system. System tuning can, with conventional solutions, take months to reach optimal performance levels. Consequently, solutions that ease and accelerate the tuning process, and that integrate seamlessly into a customer’s application, are highly desirable.

Automotive ANC solutions — then and now
Most existing ANC systems for engine noise require a dedicated hardware control module. But automakers are beginning to realize that it’s more cost effective to integrate ANC into existing vehicle hardware systems, such as the infotainment head unit. This level of integration facilitates cooperation between different audio processing tasks, such as managing a hands-free call and reducing noise in the cabin.

Earlier today, QNX announced the availability of a brand new software product that targets ANC for engine tone reduction in passenger vehicles. It’s a flexible, software-based solution that can be ported to floating- or fixed-point DSPs or application processors, including ARM, SHARC, and x86, and it supports systems with or without an OS. A host application that executes on the vehicle’s head unit or audio amplifier manages ANC through the library’s API calls. As a result, the host application can fully integrate ANC functionality with its other audio tasks and control the entire acoustic processing chain.

Eliminating BOM costs
The upshot is that the QNX ANC solution can match or surpass the performance of a dedicated hardware module — and we have the benchmarks to show it. Let me leave you with some of the highlights of the QNX Acoustics for Active Noise Control solution:

  • Significantly better performance than dedicated hardware solutions — The QNX solution can provide up to 9 dB of reduction at the driver’s head position, compared to 5 dB for a comparable hardware solution in the same vehicle under the same conditions.
     
  • Significant BOM cost savings — Eliminates the cost of a dedicated hardware module.
     
  • Flexible and configurable — Can be integrated into the application processor or DSP of an existing infotainment system or audio amplifier, and can run on systems with or without an OS, giving automakers implementation choices. Also supports configurations of up to six microphones and six speaker channels.
     
  • Faster time to market — Speeds development by shortening tuning efforts from many months to weeks. Also, a specialized team of QNX acoustic engineers can provide software support, consulting, calibration, and system tuning.

For the full skinny on QNX Acoustics for Active Noise Control, visit the QNX website.

Distracted driving — the stats are alarming

I was driving to work the other day when I heard something on the radio that almost made me drop my smartphone. The Ontario Provincial Police (OPP) announced that, for the first time, deaths attributable to driver distraction outnumber those caused by impaired driving. So far this year, on roads patrolled by the OPP, distraction has led to 47 deaths, while impaired driving has led to 32.

This stat drives home the need for dramatically better head-unit integration of services that drivers would otherwise use their phones to access. This isn't anything new to QNX. We've been working with our partners to provide all the necessary elements to enable this integration through technologies such as HTML5, Qt, iPod out, MirrorLink, and Bluetooth. All these technologies can help create systems that minimize driver distraction, but they represent only part of the solution. Moving these functions onto the head unit, combined with smart HMI design, does help, but it's not a panacea.

To truly help drivers keep their eyes on the road we have to minimize the time they spend looking at the infotainment display. Multi-modal HMIs built from the ground up with the assumption that high-quality speech recognition and text-to-speech are available will drastically change the way drivers interact with their infotainment systems. For instance, such HMIs could read your texts and emails aloud to you; they could even let you dictate responses at the appropriate time. But really, the possibilities are endless. And on the topic of talking to your car, we're constantly working with our partners to enrich the speech capabilities of the QNX CAR Platform. But more on that in an upcoming post.

By the way, I wasn't really using my smartphone while I was driving. That's illegal here. Not to mention incredibly dumb.

Seminar: managing the growing amount of software in cars

It’s no secret that the amount of software in automobiles is growing rapidly — as is the challenge of maintaining it reliably and efficiently. At QNX Software Systems we focus on areas like infotainment, telematics, clusters, and ADAS, but our long-term FOTA partner, Red Bend Software, takes a more holistic view, working with companies like Vector Informatik to extend FOTA all the way down to ECUs.

To help automakers and tier one suppliers manage their software deployments more efficiently, Red Bend is hosting a seminar Friday September 27 at the Westin Southfield Detroit. Speakers will include representatives from Strategy Analytics, Texas Instruments, and Vector, not to mention our own Andy Gryc. You can register on the Red Bend website.

New Mercedes-Benz Concept S-Class Coupé sports QNX-powered infotainment system

Paul Leroux
All-digital instrument cluster and head unit based on QNX CAR Platform

Did you ever lay your eyes on something and say, “Now, that is what I want for Christmas”? Well, I just said it — in response to a set of wheels. But holy turbochargers, what wheels! Not to mention everything else.

If you’re wondering what fueled this sudden rush of automotive desire, here’s a glimpse:



And here’s a bird’s-eye view:



And here’s a peek at the oh-so-gorgeous interior:


All images copyright Daimler AG

Mercedes-Benz took the wraps off this car, the new Concept S-Class Coupé, earlier this week. And just a few minutes ago, QNX and Mercedes revealed that the car’s infotainment system is based on the QNX CAR Platform.

This isn’t the first time QNX and Mercedes-Benz have worked together. Besides providing the OS for various Mercedes infotainment systems, the QNX automotive team has worked with Mercedes-Benz Research & Development North America since the early 2000s, providing the group with advanced technologies for the specification and prototyping of next-generation vehicle electronics. The infotainment system in the Concept S-Class Coupé is the latest — and arguably coolest — product of this long collaboration.

The Concept S-Class Coupé also packs a serious power plant: a 449 hp Biturbo V8 with peak torque of 516 lb-ft. And it offers driver-assistance technologies that are, quite literally, forward looking. Here is a sampling of what's inside:

  • Two 12.3-inch displays
  • Touchscreen showing four world clocks
  • Stereo camera offering 3D view of the area in front of the car
  • "6DVision" to detect the position and movement of objects in front of the car
  • Variety of assistance systems to monitor surrounding traffic

I’m only touching the surface here. For more details on the car, visit the Mercedes-Benz website. And before you go, check out the press release that QNX issued this morning.


Why doesn’t my navigation system understand me?

A story where big is good, but small is even better.

Yoshiki Chubachi
My wife and I are about to go shopping in a nearby town. So I get into my car, turn the key, and set the destination from POIs on the navigation system. The route calculation starts and gives me today’s route. But somehow, I feel a sense of doubt every time this route comes up on the system...

Route calculation in navigation uses Dijkstra's algorithm, devised by Edsger Dijkstra in 1956 to determine the shortest path in a graph. To save calculation time, navigation systems run two searches simultaneously: one outward from the starting point and the other backward from the destination. The data scheme that navigation systems use to represent maps consists of nodes, links, and attributes. Typically, a node represents a street intersection; a link represents the stretch of road, or connection, between two nodes; and attributes consist of properties such as street name, street addresses, and speed limit (see diagram).

Features of a map database. From Wikipedia.
As you may guess, it can take a long time to calculate the shortest path from all of the routes available. The problem is, automakers typically impose stringent requirements on timing. For example, I know of an automaker that expected the route from Hokkaido (in northern Japan) to Kyushu (in southern Japan) to be calculated in just a few seconds.

To address this issue, a system can use a variety of approaches. For instance, it can store map data hierarchically, where the highest class consists of major highways. To choose a route between two points, the system follows the hierarchical order, from high to low. Another approach is to use precalculated data, prepared by the navigation supplier. These examples offer only a glimpse of the complexity and magnitude of the problems faced by navigation system vendors.

An encouraging trend
Big data is the hot topic in the navigation world. One source of this data is mobile phones, which provide floating car data (current speed, current location, travel direction, etc.) that can be used by digital instrument clusters and other telematics components. A system that could benefit from such data is VICS (Vehicle Information and Communication System), a traffic-information standard used in Japan and supported by Japanese navigation systems. Currently, VICS broadcasts information updates only every 5 minutes because of the bandwidth limitations of the FM sub-band that it uses. As a result, a navigation system will sometimes indicate that no traffic jam exists, even though digital traffic signs indicate that a jam does indeed exist and that service is limited to the main road. This delay, and the inconvenience it causes, could be addressed with floating car data.


An example of a VICS-enabled system in which traffic congestion, alternate routes, and other information is overlaid on the navigation map. Source: VICS

During the Great East Japan Earthquake, Google and automotive OEMs (Honda, Nissan, Toyota) collaborated by using floating car data to provide road availability information, a clear demonstration of how big data can enhance car navigation. Leveraging big data to improve route calculation is an encouraging trend.

Small data: making it personal
Still, a lot can be accomplished with small data; specifically, personalization. I may prefer one route on the weekend, but another route on a rainy day, and yet another route on my wife's birthday. To some extent, a self-learning system could realize this personalization by gauging how frequently I've used a route in the past. But I don’t think that's enough. As of now, I feel that my navigation system doesn't understand me as well as Amazon, which at least seems to know which book I’d like to read! Navigation systems need to learn more about who I am, how well I can drive, and what I like.

Personalization resides on the far side of big data but offers more convenience to the driver. The more a navigation system can learn about a driver (as in “Oh, this guy has limited driving skills and doesn’t like narrow roads”), the better. It is best to store this data on a server; that way, the driver could benefit even if he or she switches to a different car or navigation system. This can be done using the latest web technologies and machine learning. Currently, navigation systems employ a rule-based algorithm, but it would be interesting to investigate probability-based approaches, such as Bayesian networks.
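As a toy sketch of that probability-based idea (the routes, contexts, and class name below are invented for illustration; a production system would use a full Bayesian network over many variables), a system could keep simple conditional counts of which route a driver chose under which conditions:

```python
from collections import defaultdict

class RoutePreference:
    """Estimate P(route | context) from counts of past trips,
    with Laplace (+1) smoothing so unseen routes keep some mass."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, context, route):
        """Record that the driver chose this route in this context."""
        self.counts[context][route] += 1

    def probabilities(self, context, candidates):
        """Smoothed probability of each candidate route for a context."""
        scores = {r: self.counts[context][r] + 1 for r in candidates}
        total = sum(scores.values())
        return {r: s / total for r, s in scores.items()}

model = RoutePreference()
for _ in range(3):
    model.observe("rainy", "wide-road route")   # driver avoids narrow roads in rain
model.observe("weekend", "scenic route")

probs = model.probabilities("rainy", ["wide-road route", "scenic route"])
```

After three rainy-day trips on the wide-road route, the model assigns it 80% probability in rainy conditions, exactly the kind of learned preference the post is asking for.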

I’m looking forward to the day when my navigation system can provide a route that suits my personal tastes, skills, and habits. Navigation suppliers may be facing threats from the mobile world, including Google and Apple, but I believe experienced navigation developers can prevail by returning to the original purpose of navigation: customer satisfaction.

Yoshiki Chubachi is the automotive business development manager for QNX Software Systems in Japan

QNX and the W3C: setting a new standard

For almost two years, you’ve heard us talk about HTML5 in the car, particularly as it applies to the QNX CAR Platform for Infotainment. And now, we're taking the next step: working with the entire automotive community to develop a standard set of JavaScript APIs for accessing vehicle sensor information.

Andy Gryc (that’s me of course) and Adam Abramski (from Intel and representing GENIVI) are co-chairs in the World Wide Web Consortium (W3C) Automotive and Web Platform Business Group. Yes, our group name is a mouthful. But the translation is that Adam and I are working with W3C group members to create a standard that everyone can agree on.

Between GENIVI, Tizen, Webinos, and QNX, four different APIs are in use today. So what’s the process? All of these APIs have been submitted to the W3C group members as contributions. Those contributions form the groundwork, creating a baseline for where we need to go. Collectively as a group, we need to merge these four APIs — figure out the commonalities and harmonize the differences to create a new standard that takes the best features of all the proposals.

This effort takes some time, but the group intends to complete a first draft by December this year. Either Tina Jeffrey (my colleague, who’s doing some of the heavy lifting) or myself will be posting our progress here, so keep an eye out for our updates!

QNX automotive summit in Shanghai: the recap

Guest post from Alan Zhang, technical solutions manager, QNX Software Systems

Alan Zhang
On August 27 the QNX Automotive Summit returned to China, bringing together global automotive leaders in beautiful downtown Shanghai. Despite the morning traffic, by 9:30 a.m. more than 130 delegates from the automotive industry had filled the Grand Ballroom at the Ritz-Carlton, Pudong. The number of delegates exceeded our expectations — our event manager Alison had to ask the hotel for extra chairs!

The theme of the summit was “explore new opportunities in automotive and mobile convergence”. The convergence of the car and the smartphone is becoming a universal topic, but China is a particularly interesting place to discuss this subject: not only is the prevalence of the car relatively new, but the country is already the world’s largest automotive market. Competition is fierce — the leaders gathered at the summit shared their expert insights for winning new and unique automotive opportunities in China.

Mission-critical pedigree: Derek Kuhn
delivers his opening remarks.
The word from Audi, AutoNavi, Foryou, Harman
Derek Kuhn, QNX vice president of sales and marketing, got things rolling with a talk on how our mission-critical pedigree and mobile experience help automotive companies address the challenges of the connected car. Following Derek were Xiaodan Tang of Audi China and Tong Zao of Harman International, who shared their views on automotive trends from the OEM and tier one perspectives.

The day before the summit, we hosted a press conference announcing our collaborations with the Chinese companies AutoNavi and Foryou. The press event attracted 37 journalists, all curious to hear about our strategy for China and who in China we are working with (see our recent posts on AutoNavi and Foryou). On the summit day we were honored to have guest speakers from these companies — Yongqi Yang, executive VP of AutoNavi, and Zou Hong, director of product management, Foryou.

Autonomous drive
In China, collaboration with the government and academia is a key topic in the automotive industry. Jin Xu, our global education program manager, and Professor T. John Koo from the Shenzhen Institute of Advanced Technologies, Chinese Academy of Sciences (SIAT CAS), shared a session titled “Shaping Future Cars in China: Research and Education.” Professor Koo leads research using QNX software at SIAT CAS and has been involved in autonomous drive research since 2003, long before the term ADAS existed. Jin introduced QNX Software Systems’ academic initiatives in China and how we are enabling future automotive engineers.

Global reach, local services
Deploying services and features that are regionally relevant is a key challenge for global automotive companies. Weiyu Liang, our director of engineering services for APAC, spoke on QNX engineering services and how we support local customers. Localization is hugely important for anybody targeting the China market. Our last guest speaker, Suo fei Li of Baidu, provider of the biggest Chinese-language search engine, spoke on how Baidu can work with automotive companies as a trusted partner rather than as just a supplier. A Baidu application running on the QNX CAR Platform was also shown at the event, along with the latest features included in version 2.1.

Our hardware partners Altera, Elektrobit, Freescale, NVIDIA, Renesas, TI, and Xilinx were also on hand, showcasing their latest automotive demos.

A unique combination
Andrew Poliak, our automotive business development director, delivered the closing presentation. Tying together various discussions that happened throughout the day, Andrew’s speech focused on QNX advantages such as platform flexibility, HMI options, advanced acoustic technology, and our unique ability to combine all of the above with functional safety. This all tied into our event theme — enabling automotive customers and giving them competitive edge to seize the new and unique opportunities in China.



Summit at a glance — a pictorial overview from QNX marketing manager Noko Kataoko

So many people were in the room, the camera couldn't fit them all in. Next year, we'll have to invest in a wider lens ;-)



Taking QNX for a drive. The exhibit hall featured several QNX automotive partners, including Altera, Elektrobit, Freescale, NVIDIA, Renesas, TI, and Xilinx:



The summit included talks from Audi, AutoNavi, Foryou, Harman, QNX, and the Shenzhen Institute of Advanced Technologies. Speakers included our own Andrew Poliak, who looks like he's discussing the virtues of the QNX logo, but is in fact pointing to his presentation on stage right:



Did I mention there was a draw for a shiny new Nikon camera? Did I mention I didn't win? Did I mention it's because, as a QNX employee, I wasn't allowed to participate? Now don't get me wrong, I'm not bitter, or anything...



Mmm... don't they look good? Besides getting a taste of what's in store for the connected car, attendees got to enjoy some other tastes as well: