Having created the first mind-controlled Bionic prosthetic leg, the head of Icelandic firm Össur discusses the promise — and challenges — of brain research.
One of the most exciting frontiers of exploration in science isn’t some faraway galaxy — it’s in our heads. The global race to unlock the mysteries of how the brain functions holds both great promise and potential peril, sparking debates over how far artificial intelligence and cyborg technologies can — and should — take humanity.
Much of the discussion is based on hypotheticals, given how much there still is to learn about the brain. But the pace of breakthroughs is poised to accelerate, with research initiatives being launched in the U.S., Europe and elsewhere.
One recent breakthrough comes from Iceland, where Össur has created the world’s first “mind-controlled Bionic prosthetic legs” — prosthetics that detect electrical impulses from the brain. Jon Sigurdsson, president & CEO of Össur, says his hope is for prosthetics to replace the full functionality of the lost limb by responding to both brain signals and the external environment.
As a pioneer in commercializing brain research, Sigurdsson sees a certain responsibility to ensure ethical considerations are “at the forefront” of any innovations. “In neurologic applications — beyond mitigating risk of physical damage to the brain — companies need to demonstrate they would help preserve and not exploit patients’ mental and emotional well-being and their whole ‘thought-life,’ such as their memories and dreams, as well,” he says.
In the interview, Sigurdsson discusses his hopes for brain research, the challenges of commercializing innovations, as well as the potential ramifications — good and bad — of artificial intelligence.
Beyond Bionic limbs, what do you see as the most exciting potential breakthrough that could come out of brain research?
Relatively recent developments in neurogenesis and neuroplasticity suggest that human brains may grow new neurons during their lifetime, and that existing neuronal pathways may be reorganized to influence specific bodily functions. This is a very different perspective than in the past, when researchers proposed that neural pathways might “die off” from lack of use.
I’m intrigued by how these concepts might apply after a person has a debilitating injury or amputation — how might neuronal pathways be channeled or redirected to enhance a person’s mobility?
You’ve described your own research into artificial intelligence as a “very humbling experience,” given how far we are from replicating how the mind and body function. What do you see as the greatest obstacles to achieving true AI?
Right now, our bionic limbs “learn” in real-time — they gather data and adjust according to a wearer’s pace, speed, gait and the texture of the terrain they are covering.
AI has been described as having three levels or categories. The first level is algorithm-based, or Artificial Narrow Intelligence (ANI). While ANI is a tremendous technological achievement, it still pales in comparison to the human brain. This is what’s humbling — understanding how extraordinarily complex our brain is, how we utilize its capacity and capability for routine activities and mundane thoughts, to say nothing of when we apply our brains to innovative challenges, such as exploring our innate creativity or engaging in problem-solving.
I believe the second level of AI, or Artificial General Intelligence (AGI), is what most AI development is pursuing today and what I consider to be closer to everyday human thought.
What are the biggest hurdles to bringing neuroscience-related products to the market?
Because of the brain’s complexity, its absolute value to human existence and its relative fragility (it’s difficult to recover from brain injury), it’s understandable that regulatory agencies are taking a conservative approach to commercializing neuroscience-related products.
We need to clearly understand how technological intervention might affect brain functionality, both initially and over longer terms. We must be confident that potential adverse events — whether from bioengineering or device applications, or biochemical ones from pharmaceuticals — can be mitigated if not avoided outright.
I would expect that, as research continues progressing, we’ll be better equipped to understand and predict the effects of neuro-related products, and this should help accelerate commercialization.
Do you share the concern of Stephen Hawking and others about the potential threat of AI to humanity, and are there any policies or other steps that should be taken to ensure we maintain control over such technologies?
The highest form of AI, Artificial Super Intelligence (ASI), involves the possibility of technology developing capabilities superior to human thinking and raises the question of whether technology might ever become dangerous to humans.
Despite the overall acceleration of technological advancement, I believe we are quite a ways from developing true ASI capabilities, and any projected problems of ASI seem unlikely during our lifetimes. However, precautionary measures are warranted, in the event ASI-type functions become reality, to prevent any negative impact on future generations. We need to consider how and when we rely on AI for critical decision-making.
On the whole, though, I’m not convinced ASI will actually happen as predicted. I return to my original hypothesis: the human body, including its infinitely beautiful and complex brain, remains the ideal standard against which all artificial technologies will be measured.
When GE acquired Alstom’s power and grid business last fall, it bought some of the world’s largest wind and water turbines, “ultra-super-critical” steam turbines (that’s their real name), massive generators and other advanced technology for making lots of electricity. With Alstom, GE technology can now generate one third of the world’s power, and that share will grow as utilities start connecting power plants to the cloud and using data analytics to optimize them and make them more efficient. For the first time, we are taking a look here at some of the most brass-kicking Alstom machines.
Hubs of Alstom’s massive Haliade turbines, which can generate 6 megawatts (MW) of electricity, at a factory in Saint-Nazaire, France. They will power America’s first offshore wind farm near Block Island, Rhode Island. Top: A Haliade turbine off the coast of Belgium. Image credits: GE Power
The Nant de Drance pumped-storage power station has an output of 900 MW. It uses electric pumps to send water into the reservoir when electricity consumption is low and releases it to generate power at peak times. Image credit: GE Power
The Itaipu Dam on the Parana River holds a row of 20 huge Alstom water turbines like the one below. They supply Brazil with a quarter of its power, and Paraguay with 90 percent of its electricity. In 2008, the turbines generated 94,684 gigawatt-hours (GWh), the largest amount of energy ever produced by a single dam in a year. Image credit: GE Power
Alstom’s Arabelle is the world’s largest turbine. It converts steam produced by boiling water with heat from nuclear reactors into electricity. It delivers 1,550 MW and it has extremely high reliability (99.96 percent). Image credit: GE Power
Alstom’s massive volute pumps cool off a nuclear power plant’s secondary loop. The pumps circulate cold water into the condenser. Image credit: GE Power
Alstom’s “ultra-super-critical” steam turbine at the Boxberg power plant in Germany can produce 600 MW. The technology allows power plants to operate at pressures and temperatures above the critical point where there is no difference between water and steam. It allows utilities to build more efficient power plants with fewer emissions. Image credit: GE Power
An otherworldly Alstom disconnector in Noventa di Piave, Italy. It can isolate circuits operating at 1,200,000 volts, roughly a tenth of the voltage carried by a lightning bolt. The disconnector is an isolator switch used to make sure that an electrical circuit is completely de-energized for service and maintenance. Image credit: GE Power
This high-voltage substation in France uses sulfur hexafluoride gas for insulation. The gas is an excellent electrical insulator and minimizes losses during operations. Image credit: GE Power
The Empire doesn’t strike back here. These aren’t baby AT-ATs but FACTS, or Flexible Alternating Current Transmission Systems. They allow utilities to transmit alternating current (AC) over long distances without suffering from natural voltage drop along the way. Image credit: GE Power
Emily Whitehead was just 5 years old when she fell ill with acute lymphoblastic leukemia, the most common type of childhood cancer. Most kids beat the disease within two years, but Emily relapsed twice after several rounds of chemotherapy.
Desperate to find a treatment, her parents enrolled her in an experimental trial at The Children’s Hospital of Philadelphia (CHOP) in early 2012. CHOP’s doctors drew her blood and collected some of Emily’s T lymphocytes — white blood cells that form the backbone of the human immune system. The team then reprogrammed the cells by altering their DNA so they would recognize and kill Emily’s cancer. Doctors injected them back into Emily’s bloodstream, where the cells multiplied and wiped out the disease.
Emily has remained cancer-free since. She’s become the poster child for a new type of personalized medicine called cellular immunotherapy, a type of regenerative medicine that seeks to fix faulty cells and tissues by reengineering their DNA and restoring their normal function. “We are on the cusp of a revolution in medicine,” says Phil Vanek, general manager for cell therapy technologies at GE Healthcare. “These are living drugs. The reprogrammed cell itself becomes the treatment that gets reintroduced to the patient. Ordinary drugs wear off after we take them, but regenerative medicine can have a durable effect. It can restore a function that was lost or, someday, give our cells one they never had.”
Top image: GE’s Xuri cell expansion system used for growing cells. Image credit: GE Healthcare Life Sciences. Above: T-cells attacking a cancer cell. Image credit: Getty Images
GE helped design the bioreactor CHOP doctors used to grow Emily’s cancer-trained T cells and has a number of products and services used in cell therapy manufacturing today. However, the company doesn’t want to innovate in a vacuum. Vanek believes the more collaboration he and his team can encourage in the industry, the faster these therapies get to the market. That’s why GE and the Canadian government will each invest CA$20 million in a new cell therapy research and process development hub in Toronto led by Canada’s Centre for Commercialization of Regenerative Medicine (CCRM). Canadian Prime Minister Justin Trudeau announced the news today.
Canadian Prime Minister Justin Trudeau (leaning) during a tour of CCRM’s labs today. CCRM President and CEO Michael May is standing on the right. Image credit: GE Reports
The new center will bring together researchers, universities, drug manufacturers and technology companies like GE to speed up the development and production of cell therapies and make them available to patients more quickly. “Cell therapy has the potential to cure everything from cancer to diabetes,” Vanek says. “But we need to make it affordable and scalable. In this era of increasing scrutiny of medical costs and reimbursement, we have to help our customers achieve a reasonable cost of goods.”
In many ways, Toronto is the perfect place for the new center. In 1961, James Till and Ernest McCulloch discovered stem cells here while working at the Ontario Cancer Institute and the University of Toronto. Stem cells will likely be key to regenerative medicine. They have the ability to turn themselves into many other different cells in the body, like blood, muscle and skin cells, and potentially repair damaged organs.
Today the city boasts a rich research and bioengineering ecosystem. Since CCRM opened in 2011, the not-for-profit has brought together 50 partners, including academic teams, pharmaceutical and technology companies, investors and regulators. The new center will be located inside Toronto’s 1-million-square-foot MaRS Discovery District, sharing the space with Facebook, Johnson & Johnson’s JLabs, Autodesk and venture investors. The main University of Toronto campus, a cluster of hospitals and the Ontario legislative assembly are right next door. “We need to integrate existing technology to break through manufacturing bottlenecks and make cell therapy affordable and available,” says CCRM President and CEO Michael May. “This rich environment will help us validate new clinical technologies and turn them into lucrative products.”
The new center will be located inside Toronto’s 1-million-square-foot MaRS Discovery District. Image credit: GE Reports
GE, one of CCRM’s founding partners, will play an anchor role in the new hub. The company has more than three decades of experience making biologics, the fastest-growing class of drugs, which includes best-sellers like Herceptin, Remicade, Humira and Avastin. Biologics are proteins grown by genetically modified cells cultured in tanks called bioreactors. GE makes bioreactors as well as other technology required to produce these drugs. “In the biopharmaceutical industry, we use cells in bioreactors to make these therapeutic proteins,” GE’s Vanek says. “We can use similar technology to multiply reengineered cells for cell therapy. Just like cells producing proteins, cells for therapy have to be washed, multiplied, purified, concentrated and packaged. We already know how to do that. But unlike bioprocessing, where the cells are discarded and only the protein is saved, we collect intact living cells and this introduces added complexity to the process.”
Vanek has other tools in his kit. In many ways, GE Healthcare’s cell therapy business is the perfect example of what the company calls the GE Store — the ability to share knowledge across many domains. The team works with GE’s bioprocess business in Uppsala, the HyClone team in Cardiff, software engineers in Bangalore and San Ramon, and scientists at GE Global Research and GE Ventures. They will also be using Predix, the cloud-based software platform for the Industrial Internet developed by GE Digital. “Our fate is determined by these collaborations,” Vanek says.
For example, cellular immunotherapy starts with the patient’s own cells. Vanek says Predix could allow manufacturers to keep a digital chain of custody of those cells. “We call it the digital thread and already use it in manufacturing,” he says. “It could allow us to track cells through the whole reprogramming and manufacturing process. We could run real-time analytics during the process and then ultimately follow the patient results. Since every patient introduces a level of biological variability, we envision smart systems in the future that could adapt the production process in real-time depending on the biology that’s coming in.”
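As a purely illustrative sketch of what such a digital chain of custody might look like, each processing step could append a tamper-evident, timestamped record to a cell batch’s history. This is a hypothetical example, not Predix code or GE’s actual implementation; the names and fields are invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import hashlib
import json


@dataclass
class CustodyEvent:
    """One step in a cell batch's processing history (hypothetical schema)."""
    step: str        # e.g. "apheresis", "activation", "expansion"
    operator: str    # who or what performed the step
    readings: dict   # in-process analytics, e.g. cell count, viability
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())


class DigitalThread:
    """Minimal append-only chain of custody for one patient's cell batch."""

    def __init__(self, batch_id: str):
        self.batch_id = batch_id
        self.events: list[CustodyEvent] = []
        self._last_hash = hashlib.sha256(batch_id.encode()).hexdigest()

    def record(self, event: CustodyEvent) -> str:
        # Chain each record to the previous hash so later tampering is detectable.
        payload = json.dumps({"prev": self._last_hash, **event.__dict__}, sort_keys=True)
        self._last_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.events.append(event)
        return self._last_hash


# Example use: track a (fictional) batch from collection through expansion.
thread = DigitalThread(batch_id="PATIENT-001-BATCH-01")
thread.record(CustodyEvent("apheresis", "clinic-A", {"volume_ml": 250}))
thread.record(CustodyEvent("expansion", "bioreactor-7", {"cell_count": 2.1e9, "viability_pct": 94}))
print(len(thread.events), "custody events recorded")
```

The point of the sketch is only that every step appends an auditable record; in the vision Vanek describes, the equivalent records would live in Predix and feed real-time analytics during production.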
Last year GE launched the GE Health Cloud, which allows hospitals to collect and work with patient data. Vanek says that, to be broadly adopted, cell therapy manufacturing processes will have to be flexible, push-button, sterile and disposable.
The new center will help, says CCRM’s May. He believes that better-designed cell therapy tool kits for clinical production could be available within a few years, and “manufacturing blueprints” by the end of the decade. Says May: “The new investment will help us break through the manufacturing bottlenecks and get us there.”
GE will open three advanced technology labs at its global research headquarters in Niskayuna, New York. One lab will focus on artificial intelligence and robotics. The other two will lead the development of a new product management science seeking to deliver higher returns on commercial technology. The company plans to hire some 100 new employees with PhDs and advanced degrees to staff them.
GE has already started using AI, robotics and software to make things. Technologies and systems like the digital thread and the brilliant factory allow the company to connect plants to Predix, its cloud-based software platform. They can track parts and products from the supply chain to production and during operations, learn from the process, improve it and lower manufacturing costs.
GE Global Research is already experimenting with collaborative robots like Baxter. Image credit: Rethink Robotics
Scientists at the two product management labs, the Performance and Cost Lab and the Science of Product Leadership Lab, will work on new ways to bring products to market in a more cost-effective way and with higher returns on investment. The Edge Lab will focus on AI and robotics. “Our transformation into a digital industrial company requires our workplace and employee skill sets to transform too,” said Vic Abate, GE chief technology officer and head of GE Global Research. “We see product management as the next frontier in science. Product management is not new, but it hasn’t been practiced as a science. This is a wholly new approach to it, similar to what Galileo did for astronomy or Sir Isaac Newton for physics.”
Abate said the world was brimming with innovation. “The key difference in today’s ultra-competitive environment is the ability to move technology into the marketplace faster and with the greatest benefit to the company and its customers,” he said.
The first GE research lab opened in nearby Schenectady, New York, in 1900. Three people worked inside the wooden structure before it burned down a year later.
It was an inauspicious beginning for one of the largest corporate research institutions in the world. GE Global Research now employs 3,000 people and runs nine labs in the United States, Brazil, China, Germany, India and Israel. Over the years, the labs have employed several Nobel laureates and developed breakthrough technologies like LEDs, brain MRI and new ceramic composite materials called CMCs for the next-generation of jet engines.
The scientists share their insights with an army of 47,000 engineers working inside GE businesses: from healthcare to oil and gas and aviation. The real payoff comes when they can use the same technology to build a better jet engine as well as to improve on a gas turbine. GE calls the approach the GE Store.
Every day is a bad day for flying if you hang out with Brian DeBruin. DeBruin runs GE Aviation’s jet engine test operations site in Peebles, Ohio, and his job is to make sure that GE engines keep working when they fly into a hailstorm, encounter a dust cloud or ingest a goose. He and his team even set off small explosions inside jet engines to simulate blade failure. “Some of these tests are relatively benign, but others are quite damaging,” DeBruin told GE Reports. “You’ve got to prove that your engines are good.” We recently sent photographer Chris New to check Peebles out.
Top image: A cowl for a GEnx engine stretches out like a giant mechanical moth. Above: The Death Star-like turbulence control structure (TCS) streamlines the air flowing inside a jet engine during testing.
Jet engine test cells have 20-foot-thick walls built from a special high-density concrete. The construction team vibrated the wet concrete down to squeeze out air and eliminate weak spots.
An engineer stands by an air inlet into a test cell. Engines like the GE90-115B, the world’s largest and most powerful jet engine, can swallow as much as 8,000 pounds of air per second.
“When the engine runs and it’s not moving, it’s kind of like a giant vacuum cleaner,” says “cell owner” Ray Staresina. “A large engine like the GE90 will pull a little tornado from the walls if the airflow doesn’t cut it off. When that happens, it distorts the data.”
An inside view of a GEnx jet engine. These blades are controlling airflow around the core of the jet engine. This air generates most of the engine’s thrust.
GE makes jet engine fan blades from composite materials, which allow engineers to build larger and more efficient jet engines. The blade design was so striking that New York’s Museum of Modern Art included one blade in its collection.
Paying for the world’s infrastructure needs is daunting, which is why we need real leadership and innovative thinking.
Unfathomably big numbers can be exhausting. When it comes to infrastructure, there’s a significant gap between ambition and reality, as the Overseas Development Institute (ODI) in London has illustrated in a series of infographics. The world is facing unprecedented demand for infrastructure investment through 2030: $57 trillion globally, $16 trillion in Europe, $8.1 trillion in North America, $9 trillion in Asia, $1.8 trillion in Sub-Saharan Africa and so on. These numbers certainly set the stage for discussion, but delivering action on that requirement will take partnership and a lot of hard work from a broad spectrum of public and private participants.
The first and perhaps biggest challenge is to find the means to pay for all of this infrastructure. The current fiscal squeeze happening all over the world means less funding from direct taxation off the balance sheets of governments and more use of alternative funding sources such as user charges, value capture revenues and local hypothecated taxes. All in all, these new types of cashflow are a much riskier proposition for any financier than direct government-sourced funding.
Combine this with a reduction of capacity in commercial lending markets and a lower risk appetite as a result of the global financial crisis, and financing infrastructure has become a lot more complicated in developed markets — let alone developing and low-income countries. The good news is that deals are still being financed, and from an ever broader range of debt providers. Traditional project finance banks are still lending most of the money, but institutional investors like Allianz are emerging fast.
However, it is easy to be fooled. The fact is that the current pipeline of deals is far below what is needed to actually address those really big numbers. For example, the Tappan Zee Bridge replacement is a mega project in the United States expected to cost around $4 billion. To invest the nearly US$60 trillion in projected global need, we would have to develop roughly 15,000 mega projects the size of Tappan Zee over the next 15 years — or roughly 1,000 $4 billion projects every year. If investment levels are to increase to achieve the targets that governments need and want, there will not be nearly enough global financing capacity, particularly in the developing markets.
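As a rough back-of-the-envelope check on those figures, using the rounded $60 trillion and $4 billion amounts cited above:

$$
\frac{\$60\ \text{trillion}}{\$4\ \text{billion per project}} = 15{,}000\ \text{projects},
\qquad
\frac{15{,}000\ \text{projects}}{15\ \text{years}} = 1{,}000\ \text{projects per year}.
$$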
Courtesy of ODI.
At the heart of the solution to this problem is the development of a sustainable, long-term financing market. That means bringing in a large pool of institutional debt providers like pension funds, sovereign wealth funds and insurance companies rather than relying solely on the constrained capacity and shorter tenors of the traditional commercial banking market. This is fine in theory, but on most projects there is a risk gap that needs to be covered before the institutions will lend.
The good news is that action is being taken to address this problem. Governments, development banks and multilaterals have realised that they cannot alleviate the problem simply by going it alone and lending directly to projects: they do not have the balance sheet capacity for the trillions of dollars of investment required. The big shift in policy has therefore been the recognition that these institutions need to leverage their capital, cover the risk gap and help bring in private-sector debt capital.
What we are now seeing is the introduction of new credit-enhancement facilities designed to bring a project’s risk profile to a level acceptable for institutional debt — either through guarantees, as we’ve seen in the United Kingdom, or through first-loss mezzanine capital from institutions like the European Investment Bank and the Asian Development Bank. The other positive development is a significant increase in the amount of capital available through new multilateral vehicles like the $100 billion China-led Asian Infrastructure Investment Bank.
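To see how first-loss capital covers that risk gap, consider a simplified, hypothetical illustration (the figures are invented and not drawn from any actual facility). If a project raises senior debt alongside a first-loss tranche of size $F$, senior lenders absorb only the losses beyond that cushion:

$$
\text{senior loss} = \max(0,\ L - F),
$$

so with, say, $F = \$150$ million on a $\$1$ billion financing, a project loss of $L = \$100$ million would be borne entirely by the first-loss provider, leaving the institutional lenders whole. That protection is what makes the asset investable for pension funds and insurers.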
These are all excellent initiatives, but it is all happening far too slowly. Mega projects frequently take a decade or more to develop and construct. We have 15 years to deliver tens of thousands of projects. There is a real danger that governments and multilaterals are trying to be too clever with these new credit-enhancement facilities and simply not recognizing the wider picture — that what this is really about is accelerating the development of new infrastructure, which is going to create jobs and foster economic growth. This will in turn alleviate poverty and address social issues.
For the private sector, infrastructure is really just another deal; firms will go at their own pace, dictated by their own commercial considerations. For governments and their citizens, infrastructure is their livelihood. They are the main beneficiaries, both in economic and social terms, as they try to manage growing populations, urbanisation and ever-greater demand for basic services.
Governments and communities cannot afford further delays. Governments at all levels need to intervene more and recognize their role in making the market. This may require them to take more risk and pay a bit more in the early years, but it can be justified in terms of the long-term benefits being generated. Once the market is established, governments can step back, but we need leadership and action right now.
GE said this morning it would sell its Appliances business to Haier for $5.4 billion. The company will also enter into a strategic partnership with the Asian white goods manufacturer to work on projects involving the industrial Internet, healthcare and advanced manufacturing.
“We are proud of Appliances’ history and performance,” said Jeff Immelt, GE chairman and CEO. “GE Appliances is performing well and there was significant interest from potential buyers, helping drive a good deal which will benefit our investors, customers and employees.”
GE started making electrical household appliances more than a century ago. The business evolved and grew in tandem with the spread of electricity and the electrical grid, which was pioneered by GE founder Thomas Edison.
A GE ad for home electrical appliances from the 1920s. Image credit: Museum of Innovation and Science Schenectady
For many decades, GE was making both big generators for power plants as well as little motors for dishwashers and washing machines. But GE has been reshaping its portfolio over the last three years. The company has been focusing on core industrial businesses such as GE Aviation and GE Power, investing heavily in software and data analytics, and transforming itself into the world’s largest digital industrial company.
The transaction has been approved by the board of directors of GE and Haier, and remains subject to customary closing conditions, including Haier shareholder approval, and regulatory approvals. The transaction is targeted to close in mid-2016.
Seeking Boston’s “diverse and technologically-fluent” talent, GE said it would move its corporate headquarters to the Massachusetts capital from Connecticut, where it’s been based for four decades. “Today, GE is a $130 billion high-tech global industrial company, one that is leading the digital transformation of industry,” said GE Chairman and CEO Jeff Immelt. “We want to be at the center of an ecosystem that shares our aspirations.”
Greater Boston is home to 55 colleges and universities, including Harvard, Wellesley and the Massachusetts Institute of Technology (MIT), and the state spends more on research and development than any other region in the world. “Boston attracts a diverse, technologically-fluent workforce focused on solving challenges for the world,” Immelt said.
GE has been considering the composition and location of its headquarters for more than three years. The company began its formal review in June 2015 with a list of 40 potential locations. The company said in a press release it picked Boston “after a careful evaluation of the business ecosystem, talent, long-term costs, quality of life for employees, connections with the world and proximity to other important company assets.”
GE is finishing a pivotal year in its transformation into the world’s largest digital industrial company. The company launched GE Digital to help its businesses, customers and partners start moving and analyzing data in the cloud and acquired Alstom’s power and grid business in what was its largest acquisition in history. GE is also well ahead of its plan to sell most GE Capital assets.
Susan Peters, GE board member and vice president for human resources, said “the move to Boston will enable GE to lead in the digital industrial space and better serve our customers.” Said Peters: “The new GE HQ will be even more technology focused with ever-present business intensity surrounded by universities, investors and the business community.”
Pictured above: Inspiration for the architects as we plan for our new collaborative HQ space.
As the lines blur between the physical and digital, we need to ensure the technological revolution has a positive impact on society.
We stand on the brink of a technological revolution that will fundamentally alter the way we live, work and relate to one another. In its scale, scope and complexity, the transformation will be unlike anything humankind has experienced before. We do not yet know just how it will unfold, but one thing is clear: the response to it must be integrated and comprehensive, involving all stakeholders of the global polity, from the public and private sectors to academia and civil society.
The First Industrial Revolution used water and steam power to mechanize production. The Second used electric power to create mass production. The Third used electronics and information technology to automate production. Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century. It is characterized by a fusion of technologies that is blurring the lines between the physical, digital and biological spheres.
There are three reasons why today’s transformations represent not merely a prolongation of the Third Industrial Revolution but rather the arrival of a Fourth and distinct one: velocity, scope and systems impact. The speed of current breakthroughs has no historical precedent. When compared with previous industrial revolutions, the Fourth is evolving at an exponential rather than a linear pace. Moreover, it is disrupting almost every industry in every country. And the breadth and depth of these changes herald the transformation of entire systems of production, management and governance.
The possibilities of billions of people connected by mobile devices, with unprecedented processing power, storage capacity and access to knowledge, are unlimited. And these possibilities will be multiplied by emerging technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3D printing, nanotechnology, biotechnology, materials science, energy storage and quantum computing.
Already, artificial intelligence is all around us, from self-driving cars and drones to virtual assistants and software that translates or invests. Impressive progress has been made in AI in recent years, driven by exponential increases in computing power and by the availability of vast amounts of data, from software used to discover new drugs to algorithms used to predict our cultural interests. Digital fabrication technologies, meanwhile, are interacting with the biological world on a daily basis. Engineers, designers and architects are combining computational design, additive manufacturing, materials engineering, and synthetic biology to pioneer a symbiosis between microorganisms, our bodies, the products we consume and even the buildings we inhabit.
Challenges and Opportunities
Like the revolutions that preceded it, the Fourth Industrial Revolution has the potential to raise global income levels and improve the quality of life for populations around the world. To date, those who have gained the most from it have been consumers able to afford and access the digital world; technology has made possible new products and services that increase the efficiency and pleasure of our personal lives. Ordering a cab, booking a flight, buying a product, making a payment, listening to music, watching a film or playing a game — any of these can now be done remotely.
In the future, technological innovation will also lead to a supply-side miracle, with long-term gains in efficiency and productivity. Transportation and communication costs will drop, logistics and global supply chains will become more effective and the cost of trade will diminish, all of which will open new markets and drive economic growth.
At the same time, as the economists Erik Brynjolfsson and Andrew McAfee have pointed out, the revolution could yield greater inequality, particularly in its potential to disrupt labor markets. As automation substitutes for labor across the entire economy, the net displacement of workers by machines might exacerbate the gap between returns to capital and returns to labor. On the other hand, it is also possible that the displacement of workers by technology will, in aggregate, result in a net increase in safe and rewarding jobs.
We cannot foresee at this point which scenario is likely to emerge, and history suggests that the outcome is likely to be some combination of the two. However, I am convinced of one thing — that in the future, talent, more than capital, will represent the critical factor of production. This will give rise to a job market increasingly segregated into “low-skill/low-pay” and “high-skill/high-pay” segments, which in turn will lead to an increase in social tensions.
In addition to being a key economic concern, inequality represents the greatest societal concern associated with the Fourth Industrial Revolution. The largest beneficiaries of innovation tend to be the providers of intellectual and physical capital — the innovators, shareholders and investors — which explains the rising gap in wealth between those dependent on capital versus labor. Technology is therefore one of the main reasons why incomes have stagnated, or even decreased, for a majority of the population in high-income countries: the demand for highly skilled workers has increased while the demand for workers with less education and lower skills has decreased. The result is a job market with a strong demand at the high and low ends, but a hollowing out of the middle.
This helps explain why so many workers are disillusioned and fearful that their own real incomes and those of their children will continue to stagnate. It also helps explain why middle classes around the world are increasingly experiencing a pervasive sense of dissatisfaction and unfairness. A winner-takes-all economy that offers only limited access to the middle class is a recipe for democratic malaise and dereliction.
Discontent can also be fueled by the pervasiveness of digital technologies and the dynamics of information sharing typified by social media. More than 30 percent of the global population now uses social media platforms to connect, learn and share information. In an ideal world, these interactions would provide an opportunity for cross-cultural understanding and cohesion. However, they can also create and propagate unrealistic expectations as to what constitutes success for an individual or a group, as well as offer opportunities for extreme ideas and ideologies to spread.
The Impact on Business
An underlying theme in my conversations with global CEOs and senior business executives is that the acceleration of innovation and the velocity of disruption are hard to comprehend or anticipate and that these drivers constitute a source of constant surprise, even for the best connected and most well informed. Indeed, across all industries, there is clear evidence that the technologies that underpin the Fourth Industrial Revolution are having a major impact on businesses.
On the supply side, many industries are seeing the introduction of new technologies that create entirely new ways of serving existing needs and significantly disrupt existing industry value chains. Disruption is also flowing from agile, innovative competitors who — thanks to access to global digital platforms for research, development, marketing, sales and distribution — can oust well-established incumbents faster than ever by improving the quality, speed or price at which value is delivered.
Major shifts on the demand side are also occurring, as growing transparency, consumer engagement and new patterns of consumer behavior (increasingly built upon access to mobile networks and data) force companies to adapt the way they design, market and deliver products and services.
A key trend is the development of technology-enabled platforms that combine both demand and supply to disrupt existing industry structures, such as those we see within the “sharing” or “on demand” economy. These technology platforms, rendered easy to use by the smartphone, convene people, assets and data — thus creating entirely new ways of consuming goods and services in the process. In addition, they lower the barriers for businesses and individuals to create wealth, altering the personal and professional environments of workers. These new platform businesses are rapidly multiplying into many new services, ranging from laundry to shopping, from chores to parking, from massages to travel.
On the whole, there are four main effects that the Fourth Industrial Revolution has on business — on customer expectations, on product enhancement, on collaborative innovation and on organizational forms. Whether consumers or businesses, customers are increasingly at the epicenter of the economy, which is all about improving how customers are served. Physical products and services, moreover, can now be enhanced with digital capabilities that increase their value. New technologies make assets more durable and resilient, while data and analytics are transforming how they are maintained. A world of customer experiences, data-based services and asset performance through analytics, meanwhile, requires new forms of collaboration, particularly given the speed at which innovation and disruption are taking place. And the emergence of global platforms and other new business models, finally, means that talent, culture and organizational forms will have to be rethought.
Overall, the inexorable shift from simple digitization (the Third Industrial Revolution) to innovation based on combinations of technologies (the Fourth Industrial Revolution) is forcing companies to reexamine the way they do business. The bottom line, however, is the same: business leaders and senior executives need to understand their changing environment, challenge the assumptions of their operating teams, and relentlessly and continuously innovate.
The Impact on Government
As the physical, digital and biological worlds continue to converge, new technologies and platforms will increasingly enable citizens to engage with governments, voice their opinions, coordinate their efforts and even circumvent the supervision of public authorities. Simultaneously, governments will gain new technological powers to increase their control over populations, based on pervasive surveillance systems and the ability to control digital infrastructure. On the whole, however, governments will increasingly face pressure to change their current approach to public engagement and policymaking, as their central role of conducting policy diminishes owing to new sources of competition and the redistribution and decentralization of power that new technologies make possible.
Ultimately, the ability of government systems and public authorities to adapt will determine their survival. If they prove capable of embracing a world of disruptive change, subjecting their structures to the levels of transparency and efficiency that will enable them to maintain their competitive edge, they will endure. If they cannot evolve, they will face increasing trouble.
This will be particularly true in the realm of regulation. Current systems of public policy and decision-making evolved alongside the Second Industrial Revolution, when decision-makers had time to study a specific issue and develop the necessary response or appropriate regulatory framework. The whole process was designed to be linear and mechanistic, following a strict “top down” approach.
But such an approach is no longer feasible. Given the Fourth Industrial Revolution’s rapid pace of change and broad impacts, legislators and regulators are being challenged to an unprecedented degree and for the most part are proving unable to cope.
How, then, can they preserve the interest of the consumers and the public at large while continuing to support innovation and technological development? By embracing “agile” governance, just as the private sector has increasingly adopted agile responses to software development and business operations more generally. This means regulators must continuously adapt to a new, fast-changing environment, reinventing themselves so they can truly understand what it is they are regulating. To do so, governments and regulatory agencies will need to collaborate closely with business and civil society.
The Fourth Industrial Revolution will also profoundly impact the nature of national and international security, affecting both the probability and the nature of conflict. The history of warfare and international security is the history of technological innovation, and today is no exception. Modern conflicts involving states are increasingly “hybrid” in nature, combining traditional battlefield techniques with elements previously associated with nonstate actors. The distinction between war and peace, combatant and noncombatant, and even violence and nonviolence (think cyberwarfare) is becoming uncomfortably blurry.
As this process takes place and new technologies such as autonomous or biological weapons become easier to use, individuals and small groups will increasingly join states in being capable of causing mass harm. This new vulnerability will lead to new fears. But at the same time, advances in technology will create the potential to reduce the scale or impact of violence, through the development of new modes of protection, for example, or greater precision in targeting.
The Impact on People
The Fourth Industrial Revolution, finally, will change not only what we do but also who we are. It will affect our identity and all the issues associated with it: our sense of privacy, our notions of ownership, our consumption patterns, the time we devote to work and leisure and how we develop our careers, cultivate our skills, meet people and nurture relationships. It is already changing our health and leading to a “quantified” self, and sooner than we think it may lead to human augmentation. The list is endless because it is bound only by our imagination.
I am a great enthusiast and early adopter of technology, but sometimes I wonder whether the inexorable integration of technology in our lives could diminish some of our quintessential human capacities, such as compassion and cooperation. Our relationship with our smartphones is a case in point. Constant connection may deprive us of one of life’s most important assets: the time to pause, reflect and engage in meaningful conversation.
One of the greatest individual challenges posed by new information technologies is privacy. We instinctively understand why it is so essential, yet the tracking and sharing of information about us is a crucial part of the new connectivity. Debates about fundamental issues such as the impact on our inner lives of the loss of control over our data will only intensify in the years ahead. Similarly, the revolutions occurring in biotechnology and AI, which are redefining what it means to be human by pushing back the current thresholds of life span, health, cognition and capabilities, will compel us to redefine our moral and ethical boundaries.
Shaping the Future
Neither technology nor the disruption that comes with it is an exogenous force over which humans have no control. All of us are responsible for guiding its evolution, in the decisions we make on a daily basis as citizens, consumers and investors. We should thus grasp the opportunity and power we have to shape the Fourth Industrial Revolution and direct it toward a future that reflects our common objectives and values.
To do this, however, we must develop a comprehensive and globally shared view of how technology is affecting our lives and reshaping our economic, social, cultural and human environments. There has never been a time of greater promise, or one of greater potential peril. Today’s decision-makers, however, are too often trapped in traditional, linear thinking, or too absorbed by the multiple crises demanding their attention, to think strategically about the forces of disruption and innovation shaping our future.
In the end, it all comes down to people and values. We need to shape a future that works for all of us by putting people first and empowering them. In its most pessimistic, dehumanized form, the Fourth Industrial Revolution may indeed have the potential to “robotize” humanity and thus to deprive us of our heart and soul. But as a complement to the best parts of human nature — creativity, empathy, stewardship — it can also lift humanity into a new collective and moral consciousness based on a shared sense of destiny. It is incumbent on us all to make sure the latter prevails.
When GE Aviation bought the storied but small Czech turboprop builder Walter Aircraft Engines in 2008, the American company hadn’t developed a propeller engine in decades. Companies like Pratt & Whitney Canada dominated the market, while GE focused chiefly on making powerful jet engines for passenger planes and military jets.
But engineers in both Prague and the U.S. hunkered down and spent the last seven years developing a new advanced turboprop engine (ATP) that could unlock this lucrative market. The bet paid off last fall when Textron Aviation, the world’s largest maker of business propeller planes, announced it would use the new engine for a brand-new plane it’s developing. GE’s turboprop business is about to take off.
GE Aviation said today it would use a portion of its $400 million investment in Europe to build its new turboprop development, test and engine-production headquarters in the Czech Republic. The center, which will employ more than 500 workers and engineers, will make the new engine for Textron and other customers beginning in 2020. “We like to build things in the Czech Republic and there is a deep pool of engineering talent in the country,” said Paul Corkery, GE’s ATP program manager. “We’ve been building aircraft engines here since the early days of aviation, but this new center will take it to a whole new level.”
The advanced turboprop, called GE93, burns 20 percent less fuel and produces 10 percent more power compared to engines in its class. Image credit: GE Aviation. Top image: A GE H80 engine inside a test cell in Prague. Image credit: GE Aviation
The advanced turboprop, called GE93, burns 20 percent less fuel and produces 10 percent more power compared to engines in its class. It will allow pilots to carry less fuel for the same mission, said Brad Mottier, vice president of business and general aviation and integrated systems at GE Aviation. Mottier says that “jetlike controls” in the cockpit will allow Textron to “design a different class of aircraft.”
Josef Walter opened his bike shop in Prague in 1898, progressed to motorcycles and car engines, and built the first aircraft engine in 1923. Image credit: GE Reports
Like the Wright brothers, Josef Walter, the founder of Walter Aircraft Engines, started out by building bicycles. He opened his bike shop in Prague in 1898, progressed to motorcycles and car engines, and built the first aircraft engine in 1923.
Walter’s design duo Novak-Zeithammer built the NZ-60 engine that started it all. This example is in the collection of the Aviation Museum in Prague. Image credit: GE Reports
Czechoslovak State Airlines signed up as one of the first customers for the plane engines in the 1920s. Within a decade, the engines had covered nearly 2 million miles in the carrier’s service. By 1936, the company was producing 18 different engines in Prague, and four other factories in Spain, Italy, Yugoslavia and Poland were making them under license. The national air forces of 13 countries used Walter engines, which served in a total of 21 countries. Later, the company also ventured into jets and kept building new engines even through four decades of Communist rule.
Following the 2008 acquisition, GE brought in new machines, investment and know-how and moved the business to a new factory. The company also revamped its product line and found new customers for Walter’s workhorse H80 turboprop engines. They now fly even to Lukla, the world’s most dangerous airport, at the foot of Mount Everest.
“Walter has always been an iconic aviation brand,” says Zdenek Soukal, commercial director of GE Aviation Czech. “It’s great to see it soar again.”
Collaboration is becoming an increasingly powerful innovation tool, but it’s important to harness its power for good. Here’s how.
Never before in history has knowledge held so much power.
The primary reason is that the world is more connected than ever before, but it goes further than that. In an increasingly networked world, the ability to reach across disciplines, time and space to work together is accelerating rapidly. Naturally, the pace of discovery has picked up. Collaboration has benefited from every change in communication pathways, and today is particularly exciting because the latest change builds a significant bridge between the digital and physical worlds.
It’s my belief that in the next five years, collaboration will bring us marked benefits in our built world and the systems we use to measure, improve and preserve it. However, it’s critically important to proceed with caution. Not everything about collaboration can be seen as a benefit. Businesses need to understand the pitfalls also. Before we get into those pitfalls, let’s first take a look at three reasons why collaboration has become such a powerful tool for business:
1. Digitization
In the days of Stradivarius, violins were made by passing verbal or handwritten knowledge from one master to one apprentice. Digitization has fundamentally changed how we learn to make a product like a violin because it expands the potential for collaboration. Digital files for both instruction and assembly can now drive machines and analysis software, and they can be copied and refined without limit, promoting rapid development. Digitization levels the playing field so that the best ideas become the top notes in the orchestras of the future.
2. Protection of User-Generated Content
The protection of user-generated content has been the necessary handmaiden in a world in which one person’s ideas can so readily flow around a connected network. Basic matters of morality and ethics, like attribution, liability and reward, can now be tracked, described and codified into law. The advent of whole constructs — such as Creative Commons and GNU licensing — has greatly assisted in this process. As a result, collaborators should feel increasingly safe in sharing their thoughts.
3. Changes in Workplace Law and Liability
The last century saw the advent of unionization in response to the tyranny of scientific management. Workers needed to leverage their collective strength to balance the demands of management so that metrics like profit did not override basic standards of decency.
In a world of collaboration, collective bargaining falls short in providing opportunities for individuals formally identified with a union to speak and achieve for themselves outside of their 20th-century “job.” Liability has undergone a similar change: with increased collaboration, shared responsibility for defective or pernicious guidance can now be recognized and liability ascribed. Those who collaborate would be well advised to know their rights and responsibilities before sharing ideas.
As collaboration becomes an increasingly powerful tool for businesses and their employees, now is the time to understand how to harness this power for good. It can be a tricky thing at times. Here are a couple of illustrations:
1. Sharing Information Versus Sharing Time
In an increasingly collaborative world, there are limits beyond which we see not only diminishing marginal returns but also the negative effects of collaboration. To be specific, sharing information and sharing time are very different forms of collaboration. Information is an infinite resource, whereas time is finite. In most projects, the people who share their time are the ones who receive the most requests to share it again in the future, and they are particularly at risk of burning out at work.
By contrast, those who share information often experience a higher degree of collaborative success. This difference highlights one of the many pitfalls in the growing pursuit of collaboration.
2. Collaboration Versus Protection
Another potential pitfall is how to deal with intellectual property, which encompasses everything from patents and trade secrets to trade dress and even national secrets. Collaboration challenges the reasoning behind all forms of IP. At Local Motors, we believe the best use of IP historically has been to ensure disclosure and credit authorship.
Unfortunately, the business world of late has discovered the difficulties of perfecting IP, which has slowed collaboration and therefore blocked innovation. This pitfall must be examined along the spectrum of types of IP to decide where collaboration might be used to achieve shared situational awareness and rapid iteration on ideas.
Collaboration has created tremendous benefits in the business world, but there are times and places where it needs to be reined in. Determining how and when to do that will continue to be a challenge for businesses of all types in the coming years. The ones who do it best are sure to be ahead of the competition.
Jay Rogers is CEO and Co-Founder of Local Motors, a technology company that designs, builds and sells vehicles through the use of co-creation and microfactories.
The 4th Industrial Revolution is a study in contrasts, as the GE Global Innovation Barometer illustrates: optimism about the power of innovation to address some of society’s greatest challenges, mixed with fear of “Digital Darwinism” and becoming obsolete; a growing recognition of collaboration as a competitive advantage, combined with the empowerment of individuals who have access to an increasing array of digital tools.
For a glimpse of how these trends are shaping the future of work and the global competitive landscape, we’ve asked several thought leaders to share their outlook on the digital revolution — and animators to bring those visions to life:
The latest GE Global Innovation Barometer shows that both business leaders and informed citizens are looking to innovation with much greater optimism. This is a powerful, encouraging sign. The global economy has recovered from the great financial crisis; we are now moving away from a period where global growth has been supported by exceptional policy measures — quantitative easing, zero interest rates — and toward a new equilibrium where economies have to stand on their own legs. Human talent, ingenuity, collaboration and entrepreneurship will be essential to secure stronger, sustainable growth. And there is a lot less fear about the impact of innovation on jobs than I expected. Business leaders and informed citizens realize that the new world will be one where humans and machines work side by side, in a way that increases safety and performance.
Still, I think people do not yet fully realize how digital technologies will make existing jobs better, more rewarding and more productive. Innovation will create new and better job opportunities. The workforce, though, will need to go through a significant transformation: the jobs of the future require problem-solving abilities, flexibility and creativity. Education and training need to catch up fast. The GE Innovation Barometer confirms that the digital-industrial revolution is real, and it is already here. The many business leaders and citizens who have already learned to make better use of data and to collaborate more — and are reaping the first benefits — are enthusiastic about this new revolution. Let’s make it happen.
Amit Narayan: How Digital Innovation Can Tackle Global Challenges
As we enter the 4th Industrial Revolution, technological innovations — particularly advances in software — are increasingly being used to address some of the world’s most pressing issues. Perhaps nowhere is software’s ability to solve our most difficult challenges more apparent than in energy. Big Data, artificial intelligence, machine learning and the Internet of Things (IoT) are enabling us to transform the electric grid — making it cleaner, more affordable and more reliable. It is true that software can’t actually generate electricity. But it can allow us to maximize the value of the power we generate through effective use of data. In doing so, we can reduce our dependence on dirty fossil fuels and improve the effectiveness of energy efficiency, renewable energy and energy storage technologies — and create a more efficient, carbon-free electric grid. Software will turn data into a new source of power. This transformation will allow us to cost-effectively extend the benefits of electricity to the 1.1 billion people in the developing world who lack access to reliable electricity, and to accelerate our transition to a fossil-fuel-free economy. When it comes to energy, software will not eat the world — it will help save it.
Don Butler: The Future of Innovation Requires Collaboration
Innovation and collaboration are inextricably paired. At Ford, this pairing is part of our DNA, from harnessing the power of the assembly line more than one hundred years ago to partnering with cutting-edge companies today as we continue redefining the way the world moves. In today’s connected world, a diversity of perspectives is needed. That’s why we’re working with Amazon to connect your car to your home, and why we’re sponsoring technology competitions and providing software platforms to foster the next generation of innovators working to dream up the future of everything from smartphone connectivity to mobility solutions to the customer experience. Our collaboration efforts are all about making people’s lives better, which results in a strong business. More than two-thirds of the executives surveyed in the GE Innovation Barometer said that collaborative innovation activities have yielded an increase in financial results. So it’s no surprise that 68 percent of executives said their firm is open to risk-sharing associated with innovation. They understand that the future of innovation — and of their companies — requires collaboration. At Ford, we’re driving innovation in all parts of our business by encouraging our team to take risks, challenge custom and question tradition. Within the auto industry, this means working together to develop connected-car technology that can help prevent accidents before they happen — all possible because of collaboration. With innovation happening at a rapidly expanding pace around the world — across borders and between industries — companies that embrace the spirit of collaboration will excel.
Don Butler is Executive Director, Connected Vehicle and Services, at Ford Motor Company.
Kakul Srivastava: How Software Collaboration Will Unleash the Next Wave of Innovation
At GitHub, we believe that democratizing the software development process will be critical to the next wave of innovation. When companies break down barriers for internal teams and invite the broader community to participate in building software openly, the speed at which we advance technology will increase exponentially. We see it in our community every day — whether it’s doctors working on software to 3D-print stethoscopes for colleagues in conflict zones, or the scientists at NASA JPL building software to help us get to Mars, or the farmer in Canada who forked a piece of driverless-car software to build an automated tractor to work his field more efficiently. Marc Andreessen famously said a few years ago that “software is eating the world.” In our view, it’s becoming clear that it’s the other way around — the world is eating software. By lowering the barrier to entry to software development and expanding the pool of people working together in the open, we will be able to reach new heights of technological advancement and explore new worlds.
Jake Schwartz: Talent Is the Key to Surviving Digital Darwinism
The fear of “Digital Darwinism” is palpable around the world, with 81 percent of executives surveyed by the GE Innovation Barometer acknowledging a fear of becoming obsolete. While tying survival to innovation is a valid mentality, the global race for talent should be top of mind. The greatest thing companies can do to stay competitive in today’s evolving digital landscape is to focus on talent — the top factor cited by executives in the Barometer for innovating successfully. Talent is the building block of an organization, and the current moment represents an opportunity for companies to build real advantage through their teams. This means bringing in new talent fluent in current trends and relevant 21st-century skills, but it also means truly investing in the development of the teams already in place. Employees value their growth and development above all else, and in that they are 100 percent aligned with the imperatives of the contemporary business environment of continuous change. Valuing people and investing in their skills will fuel a positive and consistently relevant corporate culture — and will help avoid the risk of becoming obsolete.
Jake Schwartz is Co-founder & CEO of General Assembly, a global education company, empowering individuals and companies to be successful in the digital age.
The 2016 GE Global Innovation Barometer surveys business executives in 24 countries around the globe. Use the tools below to compare results in each country and create your own customized view of the survey findings for each topic.
What innovative talents do you bring to the job? Are you the creative type, a problem-solver, able to navigate uncertainty with ease? You might be in the wrong place — even the wrong country — to optimize your talents. In fact, different innovative skills are valued more in some countries than others, as the GE Global Innovation Barometer shows. Take this quiz to find out whether your innovative talents would thrive more in Paris, Tokyo or a different innovation hub:
State leaders, powerful executives, prominent philanthropists and influential thinkers are getting together for their annual gathering in Davos this week. As usual, Marco Annunziata, GE’s chief economist, is also making the pilgrimage to the snowy Swiss valley. GE Reports caught up with him just as he was packing his bags. We talked about the importance of measuring innovation and GE’s Global Innovation Barometer, which the company released in Davos on Tuesday.
“There is now a much greater understanding that the industry of the future is one where humans and machines will work side by side,” says GE’s Marco Annunziata. Image credit: GE Reports
GE Reports: GE has been publishing the Innovation Barometer (IB) since 2010. Why?
Marco Annunziata: There is a lot of debate, and a lot of confusion, on the state of innovation: How much of it is happening? Is it helping businesses and society? What holds it back? The IB has enormous value because it gives us an insight — and data — into all these issues.
GER: Who is your audience?
MA: It’s really everyone, but there are three groups with whom it resonates the most. Economists like me can get a better sense of how quickly innovation is taking place, in which countries it is likely to move faster, what implications it will have on economic growth, jobs, living standards. Policymakers can better understand how to foster innovation: whether they need to change regulations, strengthen the education system, help facilitate funding. The answer will often vary across countries. They can also benchmark themselves to understand which countries are better at innovation and identify and follow the best practices.
Finally, business leaders can get an invaluable insight into how their peers are doing: Are they making progress on collaborative efforts? Do they see securing top talent as a priority? How much are they already benefiting from innovation? Do they bet on disruptive or incremental innovation? The IB is an unmatched opportunity to benchmark yourself against the competition, to get a broad and deep understanding of how important it is to innovate today, and what it takes to do it successfully.
GER: What jumps out at you from this year’s edition?
MA: To me, the most interesting and surprising finding of the 2016 barometer is that people are not as afraid of innovation as most press headlines would have you believe. Nowhere near. Only 17 percent of business executives and 15 percent of the informed public expect digital-industrial innovation to have a negative impact on jobs. This is a stunning result that flies in the face of all the scaremongering articles telling us that innovation will destroy jobs. And I think the reason is that digital industrial innovation is already here, it is already taking place, and people can see that it does not destroy jobs. There is now a much greater understanding that the industry of the future is one where humans and machines will work side by side, and that this will result in more and better jobs. It is part of the more general attitude toward innovation you can see in the barometer, but for me this is the most surprising — and inspiring — result.
GER: Why did GE start tracking innovation?
MA: Measuring innovation sentiment is important for two reasons. From a business perspective, you must invest in innovation if you want to innovate. And investment, ultimately, depends on the “animal spirits” of business leaders and engineers — as Keynes used to call them. It depends on whether they feel optimistic, ready to take the necessary risks. From a policy perspective, it is important to take the pulse of the public’s sentiment on innovation because it will help shape the policies and regulations that can facilitate innovation or hold it back.
GER: Are there any other tools like the IB?
MA: As far as I have seen, the Innovation Barometer is a unique instrument. It is unique because it focuses on the perspective of innovation, whereas other indicators and surveys look at broader issues of competitiveness or economic development. It is unique because it was designed by people who live and breathe innovation, and it surveys business leaders who are directly involved in the strategic innovation efforts of their companies. And it is unique because it is a living instrument, where the questions can adapt from one year to the next to give us the best sense of how the sentiment toward innovation and the environment for innovation are evolving.
Thomas Edison, one of the world’s most prolific inventors, said his main purpose in life was to make enough money to create ever more inventions. “I find out what the world needs,” Edison declared. “Then I go ahead and try to invent it.”
That sentiment still resonates. GE’s 2016 Global Innovation Barometer, an annual survey that explores how business leaders and the public see innovation, found that they viewed the most innovative companies as those “who create entirely new markets or products, rather than improving or iterating on existing ones.”
The results, which were released today on the eve of the World Economic Forum in Davos, found that respondents were optimistic about the business potential of digital technologies and big data and “felt curious” about the “fourth Industrial Revolution” these technologies have set off. However, many of them feared being left behind by technologies evolving faster than they could adapt, and most still favored an incremental approach to innovation that mitigated risk.
“There is now a much greater understanding that the industry of the future is one where humans and machines will work side by side,” says GE Chief Economist Marco Annunziata.
Marco Annunziata, GE’s chief economist, said he was “stunned” and “inspired” by the Barometer’s findings on jobs. Only 17 percent of business executives and 15 percent of the informed public expected digital-industrial innovation to have a negative impact on jobs, he said. “This is a stunning result that flies in the face of all the scaremongering articles telling us that innovation will destroy jobs,” Annunziata said. “I think the reason is that digital-industrial innovation is already here, it is already taking place, and people can see that it does not destroy jobs. There is now a much greater understanding that the industry of the future is one where humans and machines will work side by side, and that this will result in more and better jobs. It is part of the more general attitude toward innovation you can see in the barometer, but for me this is the most surprising — and inspiring — result.”
The epicenter of the digital revolution has been Silicon Valley, and the U.S. has remained the global innovation champion, with Japan climbing to No. 2. However, the barometer detected the highest excitement about innovation in emerging economies like Indonesia, Nigeria and Turkey, where respondents reported feeling greatly empowered by the global digital transformation.
“This year’s Barometer highlights the enormous pressure put on businesses to disrupt themselves,” said Beth Comstock, GE vice chair for innovation. “But a belief in the transformative power of innovation persists, propelling them forward as they struggle and adapt to today’s increasingly competitive business environment.”
Edison, whose power plants and electric machines helped build the foundation on which GE stands today, would have agreed. Explore the Innovation Barometer results in full and find more related stories, quizzes and infographics on its dedicated page here.
By creating new ways of learning and working, we can not only close the skills gap but also unlock the economy’s growth potential.
Ask CEOs what their top challenge is and they will tell you: recruiting and retaining skilled talent across their enterprises.
While the problem may be global, the solutions often play out on a local level. Take Huntington Ingalls Industries, a shipbuilder in Virginia. CEO Mike Petters describes his central challenge as turning novices into skilled shipbuilders. Since that’s not a degree offered in many schools, the company built its own school. The result: the flagship school has surpassed 10,000 graduates, and those graduates routinely stay at the company for the long haul.
While each industry or company may find its own solution to solving the skills gap, much can be shared and learned among everyone who faces this problem. That’s why I was gratified to see that Employment, Skills and Human Capital is featured as a key challenge area of the annual World Economic Forum meeting in Davos later this month.
About a year ago, Business Roundtable partnered with Change the Equation to ask CEOs just how big a problem the skills gap actually is. Despite the broad coverage of the issue, and despite its importance to many CEOs, we were still surprised by the results.
Nearly 98 percent of CEOs said that the skills gap was a problem for their companies. The gap affected all levels of jobs — from entry level to those requiring experience — and CEOs particularly struggled to find candidates with the STEM (science, technology, engineering and mathematics) skills needed for technical positions that are the foundation of economic growth.
These results prompted us to step up our efforts to identify and support effective programs in the increasingly connected worlds of education, training and work. One year later, we have some encouraging progress to report, but also a clear understanding that righting this ship will take some time. It will take a multipronged approach to confront this challenge.
It all starts with our longstanding role as a leading voice in K-12 education in the United States, especially for elementary education. Business Roundtable remains vocal in supporting challenging academic standards in primary and secondary education, benchmarked against the best ones used by our global competitors. States should maintain flexibility in what precise standards they adopt, but they must be tied to what colleges, employers and the military expect from students.
In Washington, we continue to urge policymakers to enact legislation that modernizes the U.S. education system and raises the bar for students and schools alike, such as the Every Student Succeeds Act.
The focus must continue after high school graduation, as well. That is why our CEOs are calling for a modernized Higher Education Act and are working to ensure career and technical education programs authorized under the Perkins Act align with employer needs.
Government, of course, cannot solve every problem. Business Roundtable is also building public-private partnerships to ensure college-level students are prepared to meet employers’ needs on the job in key economic sectors. Working with the Business-Higher Education Forum, we’re bringing together leaders from business and higher education to transform undergraduate education in emerging fields, such as financial services data analytics.
Closing the skills gap for the long run, however, will require building a nation of life-long learners. First steps toward that goal include a transformation of the relationship between employers and employees — as well as better collaboration between business and community colleges. To zero-in on these priorities, Business Roundtable and ACT Foundation launched the National Network of Business and Industry Associations, an innovative partnership that joins 25 organizations focused on better connecting learning and work.
Among other goals, the Network’s member organizations are working to rethink how various professional organizations build credentials, so that workers can move easily between professions, and to increase the use of competency-based hiring practices across the entire economy.
If we get this right, we can do more than just close the skills gap. We can create new ways of learning and working, which will unlock the potential of employees and generate new growth that benefits all of us. This will put more people in more jobs, which will grow the economy and create even more jobs for tomorrow — and beyond.
The message from the CEOs on hand in Davos will be clear: that brighter future begins now.
Please send me an email if you would like to learn more or get involved with any of these efforts.
(Top image: Courtesy of Thinkstock)
Dane Linn, a Vice President at Business Roundtable, oversees the Education & Workforce Committee, advancing Business Roundtable positions on education reform and U.S. innovation.
For most people, the term “next generation” isn’t the first thing that comes to mind when they think of coal. After all, everything about it is old.
The sedimentary rock, composed of ancient fossilized plants, has served as a fuel source for millennia. More than 2,000 years ago, people in China used it to keep warm and to smelt copper. Later on, it fueled the machinery that powered the Industrial Revolution. Then, with the advent of electricity, it heated the steam that turned power plant generators.
When Thomas Edison built the world’s first central power plant on Pearl Street in downtown Manhattan in 1882, it used six coal-burning steam engines. Today, coal supplies nearly 30 percent of global energy consumption — its highest share since 1970 — and provides 40 percent of the world’s electricity. While its share of electricity generation is expected to drop to 30 percent by 2040 (34 percent in America), coal will remain the backbone of the power systems in many countries, despite new capacity coming from natural gas and renewables. In Southeast Asia, for example, where energy demand is projected to spike by 80 percent, coal will become the single largest energy source in the region’s energy mix, owing to its abundance and relative affordability.
Top image: RDK 8’s “ultra supercritical” steam turbine. The water pressure inside reaches 4,000 pounds per square inch, more than what’s exerted when a bullet strikes a solid object. The water, which exists in a “supercritical state,” is heated to 1,112 degrees Fahrenheit (600 degrees Celsius). Above: Edison’s “jumbo dynamo” at the world’s first power station in Lower Manhattan. The power plant’s efficiency was just 1.6 percent. Image credit: Museum of Innovation and Science Schenectady
“You simply can’t close your eyes to coal,” says Olivier Le Galudec, a power plant engineer who is the head of performance calculation and testing at GE Steam Power Systems. “We’ll still need it going forward to provide a good amount of our power. We need to do this while achieving emissions targets, and we need modern, responsible coal-based generation that makes the maximum possible use of all the heat from the fuel.”
There has always been a great deal of room for improvement. The first steam-turbine generator, which started operating just two years after Pearl Street Station, converted a measly 1.6 percent of the heat from coal into electricity, a laughably inefficient figure by modern standards.
This massive silver cigar called the feedwater tank holds water before it enters the boiler. The tank can store 100,000 gallons of water heated high above its boiling point to 380 degrees Fahrenheit (193.3 Celsius). To improve efficiency, the plant extracts some of the steam from the water-steam cycle to reheat it. Image credit: GE Power
Performance has improved over time, reaching 20 and then 30 percent efficiency. Every bump meant a smaller burden on the environment and public health, and lower costs for utilities. “The efficiency of converting coal into electricity matters: more efficient power plants use less fuel and emit less climate-damaging carbon dioxide,” wrote the authors of the International Energy Agency report on measuring coal plant performance.
Moving the average global efficiency of coal-fired power plants from today’s 33 percent to 40 percent by deploying more advanced technology could cut carbon dioxide (CO2) emissions by 2 gigatons every year, the equivalent of India’s annual CO2 emissions.
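A rough back-of-the-envelope check makes the scale of that figure easier to see. The sketch below assumes that CO2 emitted per kilowatt-hour scales inversely with plant efficiency, and that coal-fired generation emits on the order of 10 gigatons of CO2 per year worldwide; the 10-gigaton figure is an assumption for illustration, not a number from the article.

```python
# Back-of-the-envelope check on the efficiency claim, under two stated assumptions:
#   1. CO2 emitted per kWh scales inversely with plant efficiency.
#   2. Coal-fired power emits roughly 10 Gt of CO2 per year worldwide
#      (an assumed round number, not taken from the article).
current_efficiency = 0.33        # today's global average for coal plants
improved_efficiency = 0.40       # level cited in the article
annual_coal_power_co2_gt = 10.0  # assumption, gigatons of CO2 per year

# Producing the same electricity at higher efficiency needs
# current/improved as much fuel, so the fractional saving is:
saving_fraction = 1.0 - current_efficiency / improved_efficiency
saving_gt = annual_coal_power_co2_gt * saving_fraction

print(f"Fuel and CO2 saved per kWh: {saving_fraction:.1%}")  # ~17.5%
print(f"Annual CO2 avoided: ~{saving_gt:.1f} Gt")            # ~1.8 Gt, on the order of 2 Gt
```

Under those assumptions the result lands in the same ballpark as the 2-gigaton figure above, even though the real accounting works plant by plant.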
Most scientists believe that greenhouse gases like CO2 contribute to climate change. Just yesterday, NASA and the National Oceanic and Atmospheric Administration (NOAA) reported that 2015 was the warmest year since record-keeping began in 1880.
The boiler, in brown, pressurizes and heats the water to 1,112 degrees Fahrenheit (600 Celsius). At very high pressure and temperature, water becomes a supercritical fluid, a phenomenon where it no longer has specific liquid and gas phases. Image credit: GE Power
The efficiency race just scored a new record. GE said a plant at the Rheinhafen-Dampfkraftwerk electrical generation facility in Karlsruhe, Germany, which is using its technology, has achieved a 47.5 percent net thermal efficiency while producing 912 megawatts of electricity. Le Galudec says no hard-coal-fired steam power plant has been able to top this number.
The plant, called RDK 8, surpassed the previously recognized record holder at the Nordjylland plant in Denmark, which achieved 47.1 percent efficiency, and achieved a full 10 percent better efficiency than the average of all currently operating German coal-fired plants. “I’m not aware of an operating plant that we’ve tested or that of our competition that beats the RDK 8,” Le Galudec says. “This is a benchmark — the highest thermal efficiency we’ve yet seen for a coal plant. It’s really sky-high.”
Boiler mills crush coal into a fine powder that turns into a fireball inside the boiler and heats the water. Image credit: GE Power
Higher efficiency means that the plant produces less CO2 by using less coal to generate the same amount of electricity as the other plants. RDK 8, operated by the third-biggest German utility, EnBW, reduces specific CO2 emissions by 40 percent compared to the global average conventional coal-fired fleet. Connecting the RDK 8 plant to Karlsruhe’s district heating system boosts fuel utilization to levels above 60 percent.
The plant has achieved better efficiency through an exacting application of science. It relies on something called ultra-supercritical steam to squeeze out better performance. When RDK 8 is operating, the steam entering the high-pressure turbine is under a pressure of around 4,000 pounds per square inch (275 bar in metric terms), more than what’s exerted when a bullet strikes a solid object, and is heated to 1,112 degrees Fahrenheit (600 degrees Celsius).
A 3D rendition of the power plant. The boiler is brown, the horizontal water feeder white and the ultra supercritical turbine is purple. The generator (in yellow) is attached to the turbine. It can produce 912 megawatts. Image credit: GE Power
Water becomes a supercritical fluid at pressures and temperatures considerably below these; its critical point lies at roughly 3,200 pounds per square inch and 705 degrees Fahrenheit (374 degrees Celsius). Past that point, water no longer has distinct liquid and gas phases. Instead, it exhibits properties of both at the same time. In this state, supercritical steam becomes much more efficient at driving the turbines that spin the electricity-producing generators. “With systems like RDK 8, we’re deep in the supercritical fluid territory, where both the quantity and quality of energy allow us to extract more energy from the steam to turn into electricity,” Le Galudec says. “This is a major contributor to high-efficiency systems.”
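For readers who want to check the numbers, the short conversion sketch below compares RDK 8’s cited steam conditions with water’s critical point. It is only an illustration; the critical-point values are standard physical constants rather than figures from the article.

```python
# Compare RDK 8's cited steam conditions with water's critical point.
PSI_TO_MPA = 0.00689476

# Approximate critical point of water (standard physical constants).
CRITICAL_PRESSURE_MPA = 22.06
CRITICAL_TEMPERATURE_C = 374.0

# Operating conditions cited in the article.
rdk8_pressure_mpa = 4000 * PSI_TO_MPA  # about 27.6 MPa, i.e. roughly 275 bar
rdk8_temperature_c = 600.0             # 1,112 degrees Fahrenheit

is_supercritical = (rdk8_pressure_mpa > CRITICAL_PRESSURE_MPA
                    and rdk8_temperature_c > CRITICAL_TEMPERATURE_C)

print(f"Pressure: {rdk8_pressure_mpa:.1f} MPa vs critical {CRITICAL_PRESSURE_MPA} MPa")
print(f"Temperature: {rdk8_temperature_c:.0f} C vs critical {CRITICAL_TEMPERATURE_C:.0f} C")
print("Steam is supercritical:", is_supercritical)  # True
```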
The science doesn’t stop with higher efficiency. Similar to the emission controls used on cars, power plants like RDK 8 have sophisticated systems to remove contaminants such as nitrogen and sulfur oxides and particulates from flue gases. Europe has some of the most stringent emission regulations in the world, and today’s technology can meet and in certain cases exceed these standards.
An exterior view of the RDK 8 power plant. Image credit: GE Power
Le Galudec says that the efficiency bumps engineers are seeing by pushing back the boundaries of supercritical steam temperature and pressure mean that there is still more room for improvement.
GE is already looking into what happens when you bring steam temperatures up to 1,300 degrees Fahrenheit (700 degrees Celsius). “We’ll do better than what we’ve just achieved, because every day we work to go higher,” he says. “Raising the temperature will keep pushing efficiency and after that there will be other ways. I am absolutely sure the race to efficiency is not over.”