Nexttech

Creating Generational Legacies

Wednesday, May 10, 2017

Face-to-face contact improves our health

Courtesy Bill Gross - Idealab

Susan Pinker gave a great talk showing how different parts of your brain light up with human contact; only face-to-face contact and interaction engage the brain that way.

Passively watching a video didn’t do the same thing.

She studied a small village in Sardinia where the population has an above-average number of centenarians, male and female, which she attributes in part to the constant, close personal interactions of the villagers.

She showed a graph ranking the predictors of staying alive.

Social integration and close relationships topped the list, even above smoking, drinking, exercise, being overweight, and clean air! 

She said that genes account for 25% of the variance in longevity, and lifestyle accounts for 75%. 

She said that women live longer than men because they are more likely to prioritize close, in-person friendships. 

She urged people to “build your village, it’s actually a matter of life and death.”

(Check out www.bbg.business) 

Monday, April 24, 2017

Airbus wants to test autonomous flying cars sometime this year


 

French aerospace giant Airbus wants to have an autonomous flying car in the air by the end of the year, according to the group’s chief executive, Tom Enders.

Airbus claims the autonomous flying car will alleviate traffic problems in major cities and could reduce infrastructure budgets for city planners, who won’t have to worry about bridges, traffic lights, or concrete roads.


“One hundred years ago, urban transport went underground, now we have the technological wherewithal to go above ground,” said Enders at the DLD tech conference in Munich. “We are in an experimentation phase, we take this development very seriously.”

Airbus formed the Urban Air Mobility division last year, to start work on a prototype flying car. It hopes to test this by the end of the year, but Enders said it would most likely be 2020 before any commuters hop into a flying car.

Flying cars aren’t choppers

The company also plans to build a semi-autonomous flying car, so the whole project isn’t gutted if regulations say a driver must be able to control the vehicle.

Airbus has a new business plan for the flying cars, not at all like its commercial helicopters, which are sold at a price only a few can afford. It wants to develop an Uber-like app for the flying cars, where commuters can rent the autonomous vehicle for a single ride.

Interestingly, Uber also wants to have a service for flying cars, using Airbus’ concept.

The flying car reality might only be a few years away, but with the Federal Aviation Administration (FAA) already struggling to regulate the drone market, it might be an intense fight to allow humans inside autonomous flying vehicles.


Thursday, April 13, 2017

Will AI lead to fewer jobs? Good or bad for business?

 

I’ve been asked many times recently to comment on how the rise of AI will impact the jobs and the economy, particularly in customer service and contact centers. I’ve seen wildly differing forecasts, from the dire predictions of Elon Musk to the optimistic predictions of Accenture. According to Forrester’s recently released ‘The Future of Jobs’ report, robots will take 24.7 million jobs by 2027, but create 14.9 million new jobs in the same period. There is no doubt that AI will impact jobs globally more than any other technology in our lifetime. The key question is “what should we do about it?”

The answers depend on your point of view and whether you’re a government leader, a business leader or a worker thinking about your own future. Should we tax robots, as Bill Gates suggests? Should we adopt universal basic income as Musk suggests? “Ultimately,” said Musk, “I think there will need to be some sort of improved symbiosis with digital superintelligence, but that’s a pretty involved discussion.”

There are huge societal questions that I won’t attempt to answer here. Instead, I tend to approach the topic of AI and jobs in the same way that I approach the question “how do you eat an elephant?” (Answer: one bite at a time). There are several near-term challenges and opportunities for businesses, and the best thing that business leaders can do is understand what those are.

While some see a bleak future, I see a future where AI and machine learning will create new categories of work, and amplify human intelligence. Computers bring incredible processing power and memory, and can mine vast amounts of information in a short period of time, while humans bring the emotional intelligence and problem-solving skills to handle unexpected or uncommon situations. In the next few years, I see AI becoming integral to the productivity of the workforce. 

Understand and embrace the changes

As leaders think about how AI will impact their businesses in the next few years, there are several key questions they should consider:

  1. How can AI (specifically chatbots) reduce labor costs and improve customer experience? 
  2. What can businesses do to reduce the risk of automation on the workforce?
  3. What new jobs can be created because of automation?
  4. What are the macro-economic global ramifications of further automation?

One of the most obvious areas where AI will impact jobs in the next few years is customer service and sales, especially in the contact center. Chatbots have the potential to help businesses significantly cut labor costs, which increases profits but has a human impact. Improvements in AI have enabled chatbots to create effective automated responses that help businesses generate sales and boost consumer satisfaction. According to a study by Oracle, nearly 80 percent of businesses have already implemented, or are planning to adopt, AI as a customer service solution by 2020. 

According to McKinsey, 29 percent of customer service and 36 percent of sales representative positions in the US could be automated through chatbots and other technology. BI Intelligence estimates that this equates to savings of $23 billion annually in customer service salaries, and $15 billion annually in sales salaries. 
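The arithmetic behind estimates like these is simple: headcount times average salary times the automatable share. A quick sketch makes the figures above easy to sanity-check. Note that the headcounts and salaries below are illustrative assumptions chosen to land near the quoted totals, not numbers from the McKinsey or BI Intelligence reports.

```python
# Back-of-envelope sketch of the salary-savings estimates quoted above.
# Headcounts and average salaries are illustrative assumptions, NOT
# figures taken from the McKinsey or BI Intelligence reports.

def automation_savings(headcount, avg_salary, automatable_share):
    """Annual salary savings if a given share of positions is automated."""
    return headcount * avg_salary * automatable_share

# Assumed: ~2.8M US customer service reps at ~$28k, 29% automatable
cs_savings = automation_savings(2_800_000, 28_000, 0.29)
# Assumed: ~1.4M US sales reps at ~$30k, 36% automatable
sales_savings = automation_savings(1_400_000, 30_000, 0.36)

print(f"Customer service: ${cs_savings / 1e9:.1f}B/yr")   # roughly $23B
print(f"Sales:            ${sales_savings / 1e9:.1f}B/yr")  # roughly $15B
```

Under those assumed inputs the model lands close to the quoted $23 billion and $15 billion figures, which suggests the published estimates are essentially this calculation applied to official labor statistics.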

Those are compelling numbers, and it’s clear why so many companies are exploring this. Because of advances in AI, businesses can use artificially intelligent chatbots as virtual agents that replicate the effectiveness of their best human agents. This has the potential to reduce customer frustration and wait times. 

However, it is essential to remember chatbots are still an outward facing extension of the brand, and even though they are not human, consumer expectations around their performance will be high. Moreover, a robot does not have the empathy to handle a frustrated customer, or the creativity to drum up a solution to a particularly unique issue. These uniquely human capabilities shouldn’t be underestimated – they’re essential to the workforce of the future, particularly the customer experience of the future. And if companies are incentivized to invest in the platform development and training to empower humans and machines to work together, automation can be less of a risk, and more an opportunity.

What should businesses think about?

  • Which types of jobs are most easily automated, and what level of human involvement will be needed afterward?
  • What kinds of jobs are possible when a human has access to incredible processing power? Prepare to develop and train your employees for those jobs.
  • How do chatbots differ, and what are the requirements for business?
  • How can we design conversations using AI? Right now the focus is just on the call, and that’s where it ends. How can we re-think the experience across all touchpoints? 
  • How can we use AI to anticipate what the customer needs and do it on their behalf?

Bots have the power to create, not just destroy jobs. In the near future AI and chatbots will free human workers from many repetitive, mundane tasks. This will cost some jobs but it will also create new positions – some not even invented yet. (Think stables and blacksmiths vs. parking garages and mechanics a hundred years ago.) 

Let’s take a contact center today and consider how it might evolve for tomorrow. Today, there’s little distinction between someone designing conversations vs. handling customer queries, but in the near future, many of the routine activities that agents handle will go away. In the next few years, I believe that 80 percent of contact center operations will be automated. The other 20 percent will be highly paid customer service jobs, including agents with the capability to train machines to become smarter. The agent of the future will be more educated, more sophisticated, and able to apply principles of psychology to handle high-value, complex conversations with customers.

This will have a greater impact on countries such as Colombia, Guatemala, India and the Philippines, which have a much higher population of contact center agents than the United States. I envision something similar to what happened in the 90s when software maintenance work started moving to those countries. Over time, those jobs transitioned into actual development, and now many of the largest software companies, including Adobe and Microsoft, create new products there. 

Automation will affect every industry, but the vital role of humans working behind the veil of AI should not be underestimated. The notion of fully autonomous AI is still a thing of fantasy for now. For the foreseeable future, businesses will need humans to teach machines to work smarter, and bridge the gap where AI falls short – particularly when it comes to the complexities of human emotion.  Human labor remains a key component of the AI loop, and as we’ve seen with just about every other major technological advancement, some jobs will be lost but many more will be created to fit this new reality. 

Self-driving trucks - the pleasure and the pain

 
Self-driving trucks are no longer the future. They are the present. They are here! 

There will be a reduction of accidents on the road. In 2012 in the US, 330,000 large trucks were involved in crashes that killed nearly 4,000 people, most of them in passenger cars. About 90 percent of those crashes were caused by driver error. Robot trucks will kill far fewer people, if any, because machines don’t get tired or distracted. Machines don’t look at phones instead of the road. Machines don’t drink alcohol, take drugs, or fall prey to any of the other lapses that contribute to the total number of truck accidents every year.
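The 2012 figures above imply a concrete upper bound on lives saved. If roughly 90 percent of fatal crashes stem from driver error, a quick calculation shows the scale of what autonomy could, at best, avoid:

```python
# Rough arithmetic on the 2012 US figures quoted above. This is an
# upper-bound sketch: it assumes every driver-error crash is avoidable,
# which real autonomous systems will not achieve.

deaths_2012 = 4000          # people killed in large-truck crashes (approx.)
driver_error_share = 0.90   # share of crashes attributed to driver error

potentially_avoidable = deaths_2012 * driver_error_share
print(f"Potentially avoidable deaths per year: {potentially_avoidable:.0f}")
```

Even if self-driving trucks eliminated only half of driver-error fatalities, that would still be well over a thousand lives per year in the US alone.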
 
Robot trucks also don’t need salaries. No more need for health insurance either. Self-driving trucks will also never need to stop to rest, for any reason. Routes will take less time to complete.

BUT!!!!!

What does this mean for the 3.5 million truck drivers in America? Will they have jobs? What will this mean for the local economies dependent on truckers and the other 5.2 million people employed in the industry (insurance, restaurants, diners, motels, etc.)?

Add the development of the hyperloop, able to deliver freight from NY to LA in under 4 hours, and you have an interesting cocktail of cataclysmic unemployment and decimation of industries, and in turn of the small towns dependent on their custom. 
 
 

Will these small rural towns close and people move to the cities and find jobs there?

One further important detail to consider is that truck drivers are well-paid. They provide a middle class income of about $40,000 per year. That’s higher than 46% of all tax filers. Truck driving is just about the last job in the country to provide a solid middle class salary without requiring a post-secondary degree.
 
If we look at the big national picture, we are potentially looking at well over 10 million American workers and their families whose incomes depend entirely or at least partially on the incomes of truck drivers, who are also major contributors to the tax system! 
 
The replacement of truckers is inevitable. It is not a matter of “if”, it’s only a matter of “when.”  No one should be asking what we’re going to do when computers take our jobs.  We should all be asking what we get to do once freed from them. 
 
Partial source: Huffington Post and Bob Pritchard 

Wireless VR

 

Australian startup Immersive Robotics are poised to deliver what they claim is a truly universal wireless solution for PC VR headsets: tether-free virtual reality with minimal compromises in quality and extremely low latency.

I’ve always found it fascinating to observe how the advent of a new technology can accelerate the development of another. The push for rapid advances in smartphone specifications, for example, accelerated the development of mobile, high-resolution displays and low-cost IMU components, without which today’s first-generation consumer VR headsets could simply not have existed. Likewise, now that consumer VR is finally here, the demand for a solution to those ever more anachronistic, presence-sapping cables is driving innovation and rapid advancement in wireless video systems.

We’ve seen an explosion of stories centered on companies looking to take the (until now) slowly evolving sphere of wireless video broadcasting and give it a good shot in the arm. Most recently we’ve seen HTC partner with TPCast to deliver a wireless add-on solution for their SteamVR-powered Vive VR system. But prior to that we’d already heard how Valve was investing a “significant amount” in wireless video streaming for VR by way of Nitero, a specialist in the field, with Quark VR and Serious Simulations on the scene still earlier. However, when it comes to pushing the boundaries of cutting-edge technology, you can never have too many people racing to the finish line.

Immersive Robotics (IMR) are an Australian startup who have developed a wireless VR streaming and inline compression system designed from the very start to be used with PC VR headsets, offering a claimed sub-2-3ms latency, minimal compromises to image quality, and operation over existing WiFi standards you’re very likely to have in your home right now. IMR call their system the Mach-2K, and from what we’ve seen so far, it shows considerable promise. In truth, IMR’s project is far from new: the founders have been developing their technology since 2015, with a working proof of concept running first on an early OSVR headset before they secured a government grant to fund further development.

IMR was co-founded by Tim Lucas and Dr Daniel Fitzgerald. Lucas has a background in unmanned vehicle design, having worked on multiple “prominent” UAV designs, but has also worked with VR and LiDAR-powered photogrammetry, having built what he describes as “the first Virtual Reality simulation of a 3D scanned environment from an aircraft”. Lucas’ co-founder Fitzgerald hails from aerospace avionics engineering, with a PhD focusing on the then-emerging unmanned drone industry. Fitzgerald has built auto-piloting software for said drones, an occupation that let him hone his talent for algorithm development.

With the virtual reality industry now growing rapidly, the duo have set about designing a system built around proprietary software algorithms that delivers imagery to VR headsets wirelessly. “Basically from an early point in modern VR history, my business partner Dr Daniel Fitzgerald and I decided to tackle the problem of making a HMD wireless,” Lucas tells us. “Our original area of expertise was in designing high-end drones and we initially envisioned it as an interface for that area.” The team quickly realised that with the advent of consumer-level cost, room-scale VR, there were some significant opportunities to capitalise. “Soon after looking into it, we realized that logically pretty soon everyone using tethered HMDs would probably just want to get rid of the wires anyway and that the potential in this growing market was significant,” Lucas tells us. “We designed a video compression algorithm from the ground up that could compress data down to acceptable rates for current wireless technology while at the same time eliminating the flaws of current compression technology that make it unsuitable for VR, such as high added latency.”

“What we ended up with was a compression and decompression algorithm running on individual boards, which is able to plug into the HTC Vive and compress its data down by around 95% with less than 1ms of additional latency. Most of all, there is no visible degradation compared to what the user normally sees with the cables.”
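The 95% figure can be sanity-checked against the Vive's published display specs. A quick calculation, assuming the Vive's combined 2160x1200 resolution at 90 Hz with 24-bit colour (the WiFi throughput comparison is an assumed real-world 802.11ac figure, not a number from IMR):

```python
# Rough feasibility check of the ~95% compression claim quoted above,
# using the HTC Vive's published display specs. The resulting bitrate
# is compared against a plausible real-world 802.11ac throughput.

width, height = 2160, 1200     # combined resolution across both eyes
refresh_hz = 90
bits_per_pixel = 24            # 24-bit colour

raw_bps = width * height * refresh_hz * bits_per_pixel
compressed_bps = raw_bps * (1 - 0.95)   # ~95% reduction claimed by IMR

print(f"Raw video stream:   {raw_bps / 1e9:.2f} Gbps")
print(f"After compression:  {compressed_bps / 1e6:.0f} Mbps")
# A good 802.11ac link can realistically sustain a few hundred Mbps,
# so a ~95%-compressed stream plausibly fits over existing home WiFi.
```

The raw stream comes out at roughly 5.6 Gbps, far beyond ordinary home WiFi, while the compressed stream of roughly 280 Mbps is within reach of 802.11ac, which is why the compression ratio, rather than the radio, is the crux of IMR's claim.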

http://www.roadtovr.com/imr-building-wireless-video-system-power-4k-per-eye-vr-headsets/