
The Road to ecoROTR: How Building a Better Wind Turbine Began With an Online Shopping Spree for Styrofoam Balls


By Zack Lord

Scientists at GE Global Research spent the last four years building a more efficient wind turbine. The result rises 450 feet above the Mojave Desert in California – almost half the height of the Eiffel Tower – and looks like it has a silver UFO stuck to its face. It may appear strange, but you are looking at the future of wind power. The team explains how it came about.

In 2011, Mark Little, GE’s chief technology officer and the head of GE Global Research (GRC), challenged principal engineer Seyed Saddoughi and his team to build a rotor that could harvest more wind. Michael Idelchik, who runs advanced technology programs at the GRC, gave them another clue: “Since we know that the inner parts of wind turbines don’t do much for energy capture, why don’t we change the design?”


The team came up with the idea of putting a hemisphere on the center part of the wind turbine to redirect the incoming wind towards the outer parts of the blades. “The biggest unknown for us was what size the dome should be,” Saddoughi says.

The group decided to do some experiments. They bought a 10-inch wind turbine and a batch of Styrofoam balls of different sizes online, then took the lot to a wind tunnel at GE’s aerodynamic lab. “By cutting the Styrofoam balls in half, we created our domes of different sizes and then stuck these domes on the center of the small wind turbine and ran our experiments at different tunnel air speeds,” Saddoughi says.


The team hooked up the turbine to their instruments and measured the amount of voltage it produced. “Invariably we got a jump in voltage output with the dome placed at the center of the wind turbine; albeit the increases differed for different size domes,” Saddoughi says.

The scientists reached out to a colleague who did simple computer simulations for them and confirmed that even a full-size turbine was more efficient with a nose upfront. “Of course overjoyed by the very limited experimental and computational results, we wanted to come up with a name for this design, such that it really represented the idea – and was also something that everybody would remember easily,” Saddoughi says. “The team gathered in my office again, and after an hour of playing with words the name Energy Capture Optimization by Revolutionary Onboard Turbine Reshape (ecoROTR) was created.”


Above: Saddoughi is attaching differently shaped noses and turbine blades in Stuttgart. All image credits: GE Global Research and Chris New (ecoROTR)


The team then built a 2-meter rotor model of the turbine and took it for testing to a large wind tunnel in Stuttgart, Germany. The tunnel was 6.3 meters in diameter, which allowed them to dramatically reduce wall effects on the rotor’s performance.

The researchers spent a couple of months working in Stuttgart. “We conducted a significant number of experiments at the Gust wind tunnel for different tunnel air velocities and wind turbine tip-speed ratios with several variations of domes,” Saddoughi says. “The wind tunnel was also operated at its maximum speed for the blades in feathered configurations at several yaw angles of the turbine to simulate gust conditions.” They ran the turbine as fast as 1,000 rpm and carried out surface dye flow visualization experiments (see below).


Above: When dye hits the fan. Saddoughi after the dye flow visualization.

When they came back in the second half of 2012, they started designing the actual prototype of the dome, which was 20 meters in diameter and weighed 20 tons. The size presented a new batch of challenges. “Unlike gas or steam turbines that are designed to operate under a relatively limited number of set conditions, wind turbines must operate reliably and safely under literally hundreds of conditions, many of them highly transient,” says Norman Turnquist, senior principal engineer for aero thermal and mechanical systems.


They ran more calculations to make sure that GE’s 1.7-megawatt test turbine in Tehachapi, Calif., would be able to support the dome. They looked at performance at different wind speeds and directions, and during storms and gusts. They also designed special mounting adapters and brackets to attach the dome. “The design looked really strange, but it made a lot of sense,” says Mike Bowman, the leader of sustainable energy projects at GE Global Research.

The team then assembled the dome on site. “Early on, it was decided that the prototype dome would be a geodesic construction,” Turnquist says. “The reason is simply that it was the construction method that required the least amount of unknown risk.”

For safety reasons, the workers assembled the dome about 300m from the turbine and used a giant crane to move it to the turbine base for installation. But there was a hitch. “After the adapters were mounted to the hub it was discovered that bolt circle diameter was approximately 8mm too small to fit the dome,” Turnquist says. The team had to make custom shims to make it work.


The dome went up on Memorial Day in May, and the turbine is currently powering through four months of testing. “This is the pinnacle of wind power,” says Mike Bowman. “As far as I know, there’s nothing like this in the world. This could be a game changer.”



Pills on Wheels: GE is Building the World’s Largest Modular Biologics Factory


Ordering stuff online and having it shipped to your house is now as common as breathing. But JHL Biotech, a Taiwanese manufacturer of biologics, recently upped the ante and ordered an entire high-tech pharmaceuticals factory. Made by GE in Germany, Sweden and the U.S., the components for the world’s largest single-use modular plant for making biopharmaceuticals, which the company calls KUBio, recently left Europe for JHL’s new site in Wuhan, the capital of China’s Hubei Province.

When the sixty-two completed modules that make up the factory reach their destination at Wuhan’s Biolake Science Park, they will help JHL make affordable biologics for markets where they are otherwise prohibitively expensive.

Biologics, also called biopharmaceuticals, are a new class of medicines made from strings of complex proteins. They are now leading the charge against disease and represent the fastest growing class of drugs. They range from synthetic insulin to medicines that can be used to treat cancer, rheumatoid arthritis and other diseases.


“Our vision is to make world-class biopharmaceuticals affordable and accessible to all patients,” says Racho Jordanov, JHL’s chief executive. “This revolutionary modular facility is part of the realization of our vision in Asia, where US-made biopharmaceuticals are out of reach, and there is a large unmet medical need.”


Manufacturing drugs in single-use disposable plastic containers eliminates the need for costly cleaning and sterilization. It means that facilities can be smaller and more efficient. They can also be configured to switch quickly between different drugs.

GE Healthcare’s KUBio includes everything from bioprocessing equipment to the building and overall project coordination. The modules arrive at the site 80 to 90 percent pre-equipped with the heating, ventilating and air conditioning (HVAC) system, the clean room, most of the utility equipment, and all of the piping necessary to run the plant.


“It’s really very innovative and different, because today 98 percent of biopharma factories are still stick-built, meaning you design and construct the building first, then it takes around a year to get it up and running,” says Olivier Loeillot, general manager at GE Healthcare Life Sciences Asia. “Our concept is totally different because you do everything in parallel, which enables you to save up to one and half years in total [in design and construction]. This is really what is critical for companies developing biopharmaceuticals: speed.”


Once in China, GE will manage the plant’s assembly, validate the equipment and train JHL staff. The completed KUBio facility in Wuhan will have a floor space of approximately 2,400 square meters (nearly half the size of a football field) and will contain a number of 2,000-liter single-use bioreactors.

“Quality has to be built into the process,” JHL’s Jordanov says. “To control a complex process of biopharmaceutical manufacturing requires very sophisticated equipment, and very sophisticated buildings to put the equipment in.”


Touching Down on “This Cursed Rock”: First Plane Lands in Napoleon’s Last Exile


The island of Saint Helena is one of the world’s most remote places. Surrounded by the deep, cold waters of the South Atlantic, the British territory is famous for serving as the final exile of the French emperor Napoleon Bonaparte and the place where he drew his last breath. There are 4,250 people living on the volcanic outcrop, which Napoleon dubbed “this cursed rock.” Their only link to the world has been a five-day ride on a packet ship that arrives once every three weeks.

But that’s about to change. Last week, the first plane touched down on the island’s first runway. Workers have also started putting the finishing touches on St. Helena’s very first airport, which will open up the island to tourists and history buffs and break its isolation.

The first plane ever to land on St. Helena touched down in September 2015. All image credits: St. Helena Access Office.

Last week’s touchdown was the beginning of a series of landings designed to calibrate the landing strip, which is perched atop a steep cliff overlooking fierce Atlantic breakers.


One partner helping local authorities with the project was AviaSolutions, a unit of GE Capital Aviation Services (GECAS). GECAS is one of the core financing units that will remain part of GE after the company’s planned exit from banking.

The airport is scheduled to open in 2016, when the carrier Comair Limited will start weekly flights from Johannesburg, South Africa, with a brand-new Boeing 737-800.

When that happens, the island that kept Napoleon in won’t be able to keep the world out.


Charles Kenny: Who’s Going to Pay for Sustainable Infrastructure?


Everyone agrees on the need for infrastructure investment to drive development, but it will only happen under the right conditions.

In two weeks, a teeming mass of world leaders is going to descend on New York to sign up to the Sustainable Development Goals. Among the targets to be met by 2030 are global universal access to water, sanitation, reliable modern energy, and communications technologies. Back-of-the-envelope calculations suggest that meeting these infrastructure targets would involve a trillion or more dollars in additional infrastructure investment in developing countries every year.

That raises the question: where is the money going to come from? When the world’s finance ministers and aid officials met in Addis Ababa a few months ago to discuss financing the SDGs, they suggested a big role for private investment, leveraged by international support from aid through guarantees. Is this plausible, especially in the countries that need it most? At the moment, investment in infrastructure with private involvement runs at about $180 billion a year to developing countries — less than one-fifth the additional investment needed to meet the SDGs. And poorer countries and regions such as sub-Saharan Africa get a small percentage of that.

Last week, CGD hosted a panel (follow the link for video) to discuss these issues. My conclusion from the event was that money isn’t the big problem when it comes to a considerable scale-up of private investment. To put it another way: if governments can work with international organizations and the private sector to put together bankable deals, the financing will come.

The panel speakers were a balance of private and donor actors: Marianne Fay, chief economist for the sustainable development vice presidency at the World Bank; Jay Ireland, Africa CEO for General Electric; Elizabeth Littlefield, president and CEO of the Overseas Private Investment Corporation; and Tam Nguyen, global head of sustainability for the Bechtel Corporation. It was a group that had a lot to say about the limits to private investment, but also the considerably greater role it could play.

Fay put it bluntly: “The private sector is only willing to come in when it is going to make money.” That is going to limit the sectors and the infrastructure deals that will be of interest. Littlefield suggested local roads were unlikely ever to attract private finance but other parts of infrastructure could — with the right agreements. She pointed to OPIC’s 10 important features of a bankable power purchase agreement, which is first and foremost about ensuring investors get an adequate and predictable revenue stream from their power investments. Ireland suggested something similar: “Generation is the easiest thing to finance” he said, but only “if you have all of the other stuff,” including fuel access, transmission and distribution capacity, and reliable customers to purchase the energy.

It is the “other stuff” that makes deals complex, and that is why the problem at the moment is “too much money chasing too few projects — good, bankable projects,” argued Fay. Pricing is a particular problem: Ireland warned that subsidizing power “chews up the government’s balance sheet.” At the same time, Littlefield noted that now is a great time for pricing reform in electricity in particular because of the very low global price of fossil fuels.

Getting to a solid pipeline of projects would involve a combination of upstream work on pricing and regulation but also feasibility studies and preparatory work. That is what the World Bank’s Global Infrastructure Facility was designed to support, noted Fay. Nguyen mentioned Gabon’s National Infrastructure Project, designed and implemented with the support of Bechtel, which included support to the government to better manage infrastructure projects.

When it came to the role of donors, panelists once again looked beyond the money, with Nguyen suggesting donors’ most important roles were their convening power, their status as neutral parties and their ability to catalyze funding. Fay agreed; even were development banks to double their infrastructure lending, “it’s peanuts” compared to needs. International financial organizations and development finance institutions should focus on growing the pipeline of bankable projects, developing alternate sources of finance like local capital markets, and leveraging through tools including guarantees that cover sovereign risk to infrastructure projects. Ireland added that some investors need to better understand that “if you get guarantees, the private sector shouldn’t be asking for a 23 percent rate of return.”

The panel also discussed constraints to World Bank and OPIC support for infrastructure, including, in particular, opposition to fossil fuel and large dam projects from Western stakeholder groups. Littlefield and Fay both suggested there were obstacles, though both noted that hadn’t stopped them playing some role. And Ireland warned that if developing-country governments “can’t do it with this crowd, they’ll go somewhere else.” As it happened, the day after the panel, Kenya signed a deal with China about training and technical support for a nuclear power plant to be built in the country by 2025.

With a growing number of development partners to work with and greater private-sector interest in emerging infrastructure markets, the future could be bright for private infrastructure investment in regions including South Asia and sub-Saharan Africa. But if the deals are to be financially and environmentally sustainable as well as successful for development, getting prices and project preparation right is key.

(Top image: Courtesy of Thinkstock)

This piece first appeared on the Center for Global Development’s blog.

 

Charles Kenny is a senior fellow at the Center for Global Development.

Ian Bremmer: 5 Patterns Disrupting the World


Five forces are shaping political risks, from climate change to conflict. Here’s what to look out for.

We see patterns everywhere — in nature, in physics and in the world we’ve created — economic booms and recessions; market spikes and crashes; social stability and revolution. But I’ve never accepted George Santayana when he said, “Those who cannot remember the past are condemned to repeat it.” Recognizing patterns is one thing. Thinking they’re repeating themselves, that’s something else entirely.

When I think about global political risk going forward, I see five types of patterns we should watch for. They create very different types of disruptions:

  • First, patterns that occur consistently but are trending negatively, ultimately leading to crisis.
  • Second, singular long-term events that hit with tremendous force.
  • Third, patterns that bring greater and greater impact.
  • Fourth, sudden events that happen so infrequently they aren’t considered patterns…until one transpires.
  • Fifth, patterns that are speeding up, eventually happening so frequently that there’s little chance to prepare for or react to them.

The first three patterns we see coming, but for various reasons (incrementalism, collective-action problems, the daunting nature of the challenge) we tend not to plan for them effectively. The latter two are generally unrecognized until they become truly disruptive. Let me consider all five in turn.

Pattern 1 – Ever More Negative

Most of the world’s long-term social, economic and political factors are trending favorably — expansion of global education, reduction of poverty, improvements in the overall quality and duration of life, decline in the cost of commodities. As much as globalization has its disenfranchised (and, accordingly, its discontents), its overall long-term record remains strongly positive.

But I see two areas that are particularly dangerous. One is climate change: both global warming and more extreme climate conditions. Climate change skepticism has been sufficiently rebuffed that global warming is now near-universally recognized as one of the world’s most important challenges. But the science around its severity — given feedback loops in particular — is extremely complicated, making it difficult to reliably assess costs of mitigation. There, the data are open to multiple interpretations. It’s easy to obscure both the urgency of the problem and the utility of potential solutions — both useful avenues to vested interests aiming to stall, as they would be harmed by a move away from the status quo.

And, of course, climate change has become the world’s largest collective action problem, requiring coordination by a wide swath of actors that don’t tend to cooperate, especially not on matters fundamental to their economic well-being. Most problematic is that the people that will get hurt most in the near to medium term have little power (think small island nations). So there’s no incentive for the big players to move quickly.

That explains why little has been done at the state level. But it’s also true that non-state actors can (and increasingly will) play a role in transforming social response — in the energy field in particular. Further, climate change is playing out over a long time frame, with a logic that becomes increasingly compelling even as the costs are extremely high. It’s banal to say things will get much worse before they get better. But there’s also reason to believe that the biggest actors will eventually take the issue very seriously indeed (even as that comes too late for the most disenfranchised).

The certainty of further procrastination is the biggest concern around climate change, particularly given the coming impact of more extreme climate conditions on the least developed and most populated areas of the world, which are likely to experience far more violence and strife as a consequence. It may well be what ultimately stops Africa from developing (and, ironically, prevents the end to the demographic explosion on the continent, worsening the problem). That will lead to greater refugee flows and radicalism, and in turn create more extreme forms of governance both in the regions directly affected as well as through the reactions of those that aren’t.

A second significant negative trend is labor utilization. This is the issue of inequality which, of late, has been increasing in developed markets, where the world’s upper-middle classes have been hollowing out, largely at the hands of the global emerging lower-middle class (with cheaper productive labor) and the world’s upper class (with more efficient and disruptive technology).

If that continues and expands globally, it will erode the world’s emerging middle classes too, which are far less well insulated than the developed world’s against falling into severe and dangerous poverty — in countries far less insulated against social instability. French economist Thomas Piketty identified only the beginnings of this problem, most specifically felt in Europe: populism and xenophobia, eroding European institutions and lower growth. As the trend continues, it threatens the foundations of modern governance. Established middle classes have proven over history to be the world’s most potent stabilizing force. Insecure middle classes have more recently shown themselves to be powerfully reactionary…

Is this trend likely to continue? Absolutely. The good news is that it’s predictable and playing out over a sufficiently long timeframe that governments have the ability to effectively react. Some developing governments will use the pressure to improve efficiency and transparency, and in so doing redistribute wealth for more sustainable transitions. Others will turn towards authoritarianism. This will have greater consequences for global stability over time, because of the increasingly outsized impact that emerging markets will have for global growth as well as overall economic output.

Pattern 2 — Catastrophic Downturn

During the Cold War, it was the arms race. Massive nuclear arsenals were built out of an unyielding urge to prevail in competition without heed to the broader dangers posed to society. Utterly senseless in such a basic human way, yet completely calculated and rational at the same time. The big risk was the possibility that brinksmanship would get out of hand (most dramatically experienced during the Cuban missile crisis) and we’d end up in a thermonuclear war. It was unthinkable and yet…it could’ve actually happened.

Despite the fact that Americans and Russians still maintain the nuclear firepower to destroy civilization (though only about one-quarter of what both sides had at the Cold War’s peak), there’s not much threat of nuclear annihilation today. Fragmentation of the geopolitical environment, Russian decline and American indifference mean that the United States and Russia no longer threaten each other over every parcel of land. And while relations between the two are poor indeed, that’s playing out primarily in political and economic arenas, not as global military conflict. There’s a thin-tail possibility of fighting over a NATO country that could cause military brinksmanship (say, a Russian incursion or invasion in the Baltics), but I’d hardly bet on that.

Instead, I’d worry about cybersecurity. The destructive force of offensive cyber capabilities is expanding, while the number of actors wielding them is proliferating rapidly. Defenses are out of date even before they’re constructed. And the gap between the United States and other actors in offensive capabilities is diminishing.

Over the long term, this is fixable as long as it stays between state actors, for two reasons. First, most state actors are bought into the existing system, and therefore interested in relative advantage, not destruction for destruction’s sake. We’ll see more industrial espionage that “levels” the global playing field (and hurts the comparative interest of Western multinationals, reducing their profitability), but less development of so-called “integrity” attacks that seek to destroy core elements of other actors’ livelihood. Development of capabilities in the latter just leads to more mutually assured cyber-destruction, where core elements of infrastructure and other elements of national security for all actors are intrinsically vulnerable to the actions of the other side. As for economic attacks aimed at gaining comparative advantage, we’ll see more assertive deterrence and tit-for-tat responses.

But as cyber-capabilities move into the hands of non-state actors, integrity attacks become more likely. Most recently, the Ashley Madison hacks destroyed the viability of the company, and have the potential for meaningful disruption among powerful but data-vulnerable elites. These types of attacks come from small private actors that are impossible to dissuade in the present environment. As that trend continues, we’ll see governments taking a more significant role in coordinating their defenses (and punishments), creating more effective transnational cyber-regimes. But government mistrust and Internet fragmentation are obstacles to this trend, leaving room for things to get much worse before they get better. In the near term, multinationals are still strongly underspending on the risks posed in this environment; more so than, say, “sustainability” efforts on climate.

Pattern 3 — Increasing Impact

Uncertainty from a variable of ever-increasing importance: this is fundamentally the China story. China’s stock markets are almost wholly speculative, politically constrained and unmoored from the real economy. The Chinese government’s desire to transform the country into a more market-based and consumer-driven economy has led to greater experimentation and openness…which in turn has allowed for bigger short-term bubbles. None of that is breaking news; and none of it has a significant impact on China’s economic stability. But global markets were sent into turmoil by China’s recent market crashes, with many (including the United States) experiencing greater volatility than at any point since the 2008 financial crisis as a consequence.

Why? Because China is now the second-largest global economy, and well on the way to becoming the first. Sudden political intervention in China is nothing new; nor is near-complete opacity in Beijing decision-making. But the market-moving impact owed to the fact that China is a larger trading partner than the United States for every single country in Asia…and that its decisions on commodities purchasing and reserves holding can push some of the world’s most important economies into or out of recession — that’s something the world hasn’t experienced before.

All of this is going to occur at the same time that China is itself attempting to make a politically unprecedented transition from a state-investment directed economy towards a consumer-driven one…while maintaining a single-party system of political governance. If there’s one pattern I’d bet on to dramatically change the way we think about the world economy in the next 20 years, it’s the heightened volatility (and lower quality of global growth) that comes alongside the continuation of China’s rise.

Pattern 4 — Rare, But Happening Now

This is the heart of the world’s geopolitical challenge. Creative destruction occurs in the geopolitical environment, but only very infrequently — the last time was the emergence of the U.S.-led global order after World War II. That allows us to believe the international system is more stable than it actually is. But we’re now at the beginning of just such a period, where we are moving from the longstanding U.S.-led global system to…something else. It’s not yet clear what that something is: regionalization; a more multilateral global concert; a G-2 between the United States and China; or an entirely different international order. But the old system isn’t holding.

That’s going to prove a serious challenge — the rise of emerging markets broadly, the rise of China more specifically, the decline of U.S. foreign policy influence, even as the United States itself is decidedly not in decline. Together with (and in some ways related to) more unilateral approaches on the part of both the United States and its allies, as well as the greater unilateralism of 21st century tools of coercive diplomacy (drones, cyber, the weaponization of finance) — all of which undermines the Americanization of the global system.

That leads to a growth in geopolitical conflict in many areas of the world. Most immediately in the Middle East, where many authoritarian systems come under pressure and become even more repressive — or become failed states — and where there is more fighting between countries that have mutually incompatible ideas for how their respective region should be governed/controlled (Saudi Arabia versus Iran being the most obvious, but by no means the only). Russia becomes more of a revisionist actor that seeks to subvert American gains, particularly as the Russian economy comes under greater pressure. And it means more regional security tensions in Asia as well, with the rise of China.

As with pattern 3, the biggest issue here is China. We’ll see a growing challenge to U.S.-led global standards. This will make the global system less efficient, leading to fragmentation in the global free market, the World Wide Web (and perhaps the Internet of Things, an interesting question), and eventually the role of the U.S. dollar. It will ultimately bring a transfer of wealth away from the United States. The question is how the United States will respond to that rebalancing, and to what extent we are likely to see fundamental conflict as a consequence of that dynamic.

It’s not clear. The desire to demonize China is emerging as a 2016 electoral issue, in part as a reaction to demand for populism and American “leadership” (when “make America great again” is a core slogan, you can’t very well sit by and watch China become the largest economy…). But I’d argue China is still a sideshow for American candidates compared to the more tangible immigration fight and, most importantly, directly domestic economic issues. Still, it’s likely to grow, especially as China itself gets more assertive. Neither the United States nor China has any interest in destroying the existing international system. But China is prepared to take risks to improve its position. And over time, I suspect the United States — to sustain its own status — will be as well. So there’s an intrinsic danger here.

Medium term, I find this to be one of the most significant global challenges — that the Americans and Chinese prove unwilling and therefore unable to accommodate their changing balance of power in Asia and (even more difficult, in my view) beyond; which could mean their present frenemy status tips into active zero-sum confrontation in economic relations, while relations deteriorate on the military front too. That creates a new Cold War with the world split into U.S. -led and China-led blocs, and balancing between them becoming ever more challenging for third-party states. It’s a dangerous security environment for sure, and a particularly problematic environment for global growth.

The good news is that it’s unlikely to persist for long. Within two decades, China’s own core challenges get far greater — both domestically, given demographic, environmental and social/political reform trends; and internationally, as India catches up and the world becomes much more multilateral. The bad news is it’s unclear how China reacts to those pressures when they occur. And at that point, as I’ve mentioned, China has become the world’s largest economy, so the volatility implications on the global economy will be severe. That’s the most definitive risk that comes from geopolitical creative destruction.

Pattern 5 – Faster and Faster

This is the acceleration of technology and its disruptive impacts. The destruction of privacy is interesting on this front. It’s something everyone sees happening, but the erosion is occurring at exponential speed, quickly outstripping political discourse, public outrage or the prospect of any potential response. Looking forward over the next decade or so, if quantum computing comes into existence, suddenly it puts an end to anybody (person or institution) being able to maintain anonymity.

Artificial intelligence is a related area. Stripping human agency over political, economic and social systems creates technological processes that may not easily be harnessed. At worst that’s the singularity or the “grey goo”/nanotechnology scenarios that end humanity as we know it — and I can stop writing updates altogether. But well short of that, AI could change the nature of individual life — super-empowering citizens against governments in a way that makes them impossible to govern, or super-empowering institutions (private or public sector; more likely both) in ways that create much more authoritarian dystopias.

These are the most discomfiting patterns, because they’re the most disruptive. It’s hardest to identify patterns when the processes that create them are becoming ever-faster and more diverse. The optimistic view: history tells us that far more trends will be productive than destructive, creating evermore wealth and growth. The pessimistic view: it only takes one outlier to upset the apple cart.

(Top image: Courtesy of Vladmax, iStock Editorial)

This piece is based on a Eurasia Group post.

Ian Bremmer is the President and Founder of Eurasia Group.

Breaking New Ground: Digital Twin Helps Engineers Design Megawatt-Sized Circuit Breakers


We’ve all stood in the dark at least once after getting tripped up by power-hungry appliances. Typically, the remedy is just steps away: a quick flip of the circuit breaker switch, and you’re back in business.

It’s a simple fix, but it involves complex physics. “Circuit breakers protect our homes from electricity overload,” says Tim Ford, senior product manager for industrial circuit breakers at GE’s Industrial Solutions business. “This sounds easy, but the amount of energy they are often called on to dissipate is like grabbing the flywheel of a running car and stopping it.”

Ford should know. His team builds breakers that can disconnect a small power plant. Their latest device, called the GuardEon Molded Case Circuit Breaker, will be able to dissipate 2.7 megawatts. That’s enough horsepower to stop seven Porsche 911 Turbos cold in their tracks – if you could fit them inside a shoebox. The breaker is unusual since the team used powerful software for the first time to build a virtual prototype of the device – its “digital twin” – and tested it inside a computer.

After they ran the digital twin through tests inside the computer, they exposed a real-world GuardEon prototype to 100,000 amps at 480 volts. The breaker survived the short circuit. Image credit: GE Energy Management

It’s a numbers game taken to new extremes. The circuit breaker must withstand an electric arc – essentially a lightning-like discharge – that can reach temperatures as high as 19,500 degrees Celsius (roughly 35,100 Fahrenheit). That’s more than three times the temperature on the surface of the sun. In addition, the circuit breaker must withstand pressure of 17 to 20 atmospheres, the equivalent of diving 660 feet below the surface of the sea.

The breaker must also control the molten metal particulates created by the arc and handle electromagnetic forces reaching as much as 5 tons – the weight of an African elephant. All of this happens in less than a second in a volume smaller than the cavity of a microwave oven.

Another GuardEon prototype just before a short-circuit test. Image credit: GE Energy Management

“When we design these devices, we can’t just pull out the circuit breaker design 101 book from college and look at formulas because the formulas don’t exist and that book is far from being written,” Ford says.

That’s why Ford’s business partnered with software engineers at GE Global Research Center and the University of Connecticut and designed GuardEon inside a computer. They’ve been using a customized version of the commercially available software ANSYS to build the “digital twin” that will enable them to study the effects of design changes with a level of detail that has been impossible to achieve through physical sampling and testing.

GE engineers in Plainville, CT, are using software to model and test GuardEon’s digital twin. Image credit: GE Energy Management

The team can use the model to simulate the electromagnetic, mechanical and fluid dynamic aspects of circuit breaker behavior and study their interplay. The preliminary use and adaptation of the “digital twin” also allows them to reach higher performance levels and move faster in bringing the device to the market.

“Multi-physics-simulation modelling helps us narrow down 10-15 different designs into three or four that we can use for physical testing and validation,” says Dhirendra Tiwari, principal engineer and technologist at GE’s Industrial Solutions business. “Without these capabilities, we’d have to send all of the samples to the lab for development and testing and then go from there. This is expensive and time consuming.”
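
To make that down-selection concrete, here is a minimal sketch of the kind of screening step Tiwari describes: a batch of candidate designs is scored against simulated multi-physics limits, and only the few with the most margin go on to physical testing. The candidate data, limits and scoring below are illustrative assumptions only, not GE’s actual workflow or the ANSYS toolchain.

    import random

    # Illustrative candidates: in practice each metric would come from a
    # multi-physics simulation (electromagnetic, mechanical, fluid-dynamic),
    # not from random numbers.
    random.seed(0)
    candidates = [
        {
            "name": f"design-{i:02d}",
            "peak_pressure_atm": random.uniform(15, 22),       # simulated arc pressure
            "contact_temp_c": random.uniform(15_000, 20_000),  # simulated arc temperature
            "em_force_tons": random.uniform(3.5, 5.5),         # simulated electromagnetic load
        }
        for i in range(12)
    ]

    # Hypothetical design limits, loosely echoing the figures quoted in the article.
    LIMITS = {"peak_pressure_atm": 20.0, "contact_temp_c": 19_500.0, "em_force_tons": 5.0}

    def margin(design):
        """Aggregate how far a design sits below its limits (higher is better)."""
        return sum((LIMITS[k] - design[k]) / LIMITS[k] for k in LIMITS)

    # Keep only designs inside every limit, then take the three or four best.
    feasible = [d for d in candidates if all(d[k] <= LIMITS[k] for k in LIMITS)]
    shortlist = sorted(feasible, key=margin, reverse=True)[:4]

    for d in shortlist:
        print(d["name"], f"margin={margin(d):+.3f}")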

This is not the first GE digital twin. The company is already using the approach to design more efficient wind turbines and even entire wind farms.

Ford says that “although the circuit breaker will launch without actual operating experience – and there’s no way around that – we have analyzed it hundreds of times in a virtual environment to find and eliminate inefficiencies and potential weak points that would not have revealed themselves during laboratory testing. That’s pretty cool.”

Lather, Rinse, Repeat: This Solution to Climate Change Could Be Hiding in Your Bathroom



One way to reduce greenhouse gas emissions and slow down climate change is to cease burning fossil fuels. Sounds easy, but such a sudden stop would likely plunge most of today’s world into darkness and send some of the biggest and fastest-growing economies off a cliff. The reality is that coal-fired power plants, the biggest emitters of CO2, are not going away anytime soon, not in the U.S. and especially not in countries like China and India, which already burn half of the world’s coal and are leading builders of new plants.

That’s why scientists around the world are looking at the next best thing and developing new kinds of traps to stop carbon from escaping through the smokestack. GE’s Phil DiPietro and Bob Perry have been experimenting with a family of promising materials called amino silicones, commonly found in bathrooms and laundry rooms in hair conditioners and textile softeners. “Although they are in the same family, I wouldn’t recommend washing your hair or your laundry with the amino silicones we’ve developed,” laughs Perry, a chemist at GE Global Research in Niskayuna, NY, who spent the last decade developing the technology. “They’re specially formulated to scrub carbon.”

So far, the materials have been up to scratch. The U.S. Department of Energy (DOE) is holding a competition for developers of CO2 capture technologies. The prize: a chance to test your concept at a scale equivalent to a 10-megawatt coal-fired power plant. GE has passed the first phase and is now in the mix with five other developers to compete in the second leg.

Top: Positive results allowed Perry and his team, including, from left, technologists Sarah Genovese, Rachel Farnum and Tiffany Westendorf, to scale the research from a test bench (in the background) to power plant simulators. Above: GE is testing its industrial amino silicone CO2 scrubber in Alabama. Image credits: GE Global Research

The DOE will pick two finalists from that round, who will get a chance to prove their technology at the world’s largest industrial-scale CO2 capture test facility in Mongstad, Norway. The $1 billion site will allow the teams to simulate the output of a 10-megawatt coal-fired power plant and use amino silicone to capture CO2 coming through the smokestack. “This is the big test,” says DiPietro, technical manager for CO2 capture and separation at the Oil and Gas Technology Center in Oklahoma City. He says that the Norway test would require 80 tons of amino silicone solvent and capture as much as 1,600 pounds of CO2 per hour.

That would be a big step for Perry, who started working on the problem of CO2 capture a decade ago with just beakers in his chemistry lab. “We’ve started moving pretty fast,” he says.

GE’s Bob Perry. Image credit: GE Global Research

Perry says that amino silicones work like a conveyor belt. They efficiently glom on to CO2 gas at about 105 degrees Fahrenheit (40.5 Celsius) and release CO2 after the mixture is heated to 250 F (121 C). The system then cools down the material and it returns to trap more gas.

Unlike conventional carbon capture methods, Perry’s process doesn’t need any water. “This is where the money is,” he says. “If you need to boil water to drive off the CO2, you are facing an enormous energy drain. The existing technology will increase your cost of electricity by 80 percent. You almost have to build a plant that’s twice as large to power the scrubber and still send some electricity to the consumer.” Perry and his team have filed several patents for their method.
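
The numbers Perry and DiPietro cite make the conveyor-belt picture easy to sketch. The fragment below does some back-of-the-envelope bookkeeping for the swing between the two temperatures, using the capture rate quoted for the Mongstad test; the solvent working capacity and heat capacity are placeholder assumptions for illustration, not GE’s published solvent properties.

    # Temperatures and the capture target come from the article.
    T_ABSORB_C = 40.5        # solvent gloms onto CO2 at ~105 F
    T_RELEASE_C = 121.0      # CO2 is driven off at ~250 F
    CAPTURE_LB_PER_HR = 1_600
    LB_TO_KG = 0.4536

    # Hypothetical solvent properties -- assumptions for the sketch only.
    loading_kg_co2_per_kg_solvent = 0.10   # assumed working capacity per pass
    cp_solvent_kj_per_kg_k = 2.0           # assumed heat capacity

    capture_kg_per_hr = CAPTURE_LB_PER_HR * LB_TO_KG
    solvent_flow_kg_per_hr = capture_kg_per_hr / loading_kg_co2_per_kg_solvent
    delta_t = T_RELEASE_C - T_ABSORB_C

    # Sensible heat needed each hour just to cycle the solvent between the two
    # temperatures (ignores reaction enthalpy and any heat recovery).
    sensible_heat_mj_per_hr = solvent_flow_kg_per_hr * cp_solvent_kj_per_kg_k * delta_t / 1_000

    print(f"Solvent circulation: {solvent_flow_kg_per_hr:,.0f} kg/hr")
    print(f"Temperature swing:   {delta_t:.1f} K")
    print(f"Sensible-heat load:  {sensible_heat_mj_per_hr:,.0f} MJ/hr")

Because no water has to be boiled off, that sensible-heat term dominates in this simplified picture, which is the point Perry makes about avoiding the energy drain of aqueous systems.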

The “big test” will take place at the $1 billion Mongstad facility. It will require 80 tons of amino silicones. Image credit: GE Global Research

They also designed an amino silicone molecule that’s large and heavy and doesn’t escape from the smokestack. “Our solvent is really big with high molecular weight,” he says. “Its size keeps it in the process.”

The U.S. Department of Energy is running these tests because it wants the future cost of CO2 capture to be no higher than 35 percent of what it costs now. Perry says that “anyone who can get under 50 percent might be doing really well.”

Besides coal-fired power plants, Perry and DiPietro are already looking at “near-term” applications at cement plants, steel mills, small power plants and other CO2 emitters.

What about the gas? Perry says that it can be used for oil and natural gas extraction. DiPietro says that injecting CO2 down an old oil well could “yield 15 to 25 percent of the oil” originally pumped out. The gas could also help farmers grow plants. In the Netherlands, for example, farmers are using a CO2-enriched atmosphere to enhance the growth of vegetables and tulips. Now, that’s a green technology.

London Calling: Ex-Im Shutdown Prompts GE to Look Elsewhere for Export Financing


GE has signed a new export deal with the UK government that could create as many as a thousand jobs in the country.

Today’s announcement comes on the heels of a similar agreement last week with the French export credit agency COFACE that could create 400 jobs there, making this GE’s second ECA agreement with a foreign government lender since Congress failed to reauthorize the U.S. Export-Import Bank in June.

Over the past weeks, headlines in the United States, Europe and Asia have touted a series of new agreements on global trade as GE and other companies have been seeking to blunt the impact of the U.S. Ex-Im Bank’s lapse in operations. The U.S. Congress did not renew the Ex-Im Bank, as it is commonly known, at the end of June. Since then, the Bank has been unable to provide new loans, making the United States the only major industrial country to operate without an export credit agency.


While U.S. companies continue to urge Congress to renew the Bank, many have been forced to pursue alternative financing for their global customers or risk losing business. In addition to GE’s announcements last week, Boeing and Orbital Sciences Corporation, a Virginia-based satellite company, have reported lost satellite deals resulting from a lack of Ex-Im financing.

Today’s agreement with the UK export credit agency, UK Export Finance (UKEF), will unlock $12 billion in financing for UK-manufactured exports. The agreement will support both confirmed and potential orders in a number of international markets including Brazil, Ghana, India and Mozambique – markets that either require ECA financing or where such financing is critical to secure a competitive advantage.

GE estimates that winning these orders will help it create up to 1,000 new jobs in the UK energy sector.

Top: GE’s H80 turboprop engine inside a testing cell in Prague. Above: The blades of an aeroderivative turbine. Image credits: GE Reports

“We are doing everything we can to make Britain the best place in Europe to start, finance or grow a business,” said British Prime Minister David Cameron. “GE’s substantial commitment through this agreement is fantastic news. It will provide jobs and security for people working in the energy sector and elsewhere. It is a vote of confidence in our long term economic plan.”

GE is already one of the leading investors in the UK, having invested more than $21 billion there since 2003. As part of the deal, UKEF has added GE as a member of its Direct Lending Facility Partnership Panel, which will allow the company to provide technical, commercial and financial solutions to its customers.

GE Aviation already makes wing components in the UK for Airbus A380 (above) and A350 aircraft. Image credit: Adam Senatori/GE Reports

Prior to today’s UKEF agreement, GE Chairman and CEO Jeff Immelt visited Paris to formalize GE’s agreement with COFACE, which could ultimately create 400 jobs at GE’s facility in Belfort.

GE also announced last week that it would move 100 jobs responsible for final assembly of aeroderivative turbines from the U.S. to Hungary and China to ensure customer access to Export Credit Agency (ECA) financing in those countries. Additionally, last week, GE Aviation announced that it would create a $400 million turboprop engine development, test and production operation in Europe that could ultimately support between 500 and 1,000 jobs.

GE has been in close talks over recent months with various ECAs in order to secure funding for its customers. The company has made clear that, given the lack of ECA financing at home, it will expand its operations in markets where such financing is available.

“In today’s competitive environment, countries that have a functional Export Credit Agency (ECA) will attract investment,” Immelt said. “Export finance is a critical tool we use to support our customers. Without it, we can’t compete against foreign competitors who enjoy ECA financing from their governments.”


A Scientist Walks Into the GE Store: Sharing Ideas Helps Engineers Leapfrog Competition


The first GE research lab opened in a barn behind a scientist’s home in Schenectady, N.Y., in 1900. Three people worked inside the wooden structure before it burned down a year later.

It was an inauspicious beginning for one of the largest corporate research institutions in the world. GE Global Research now employs 3,000 people and runs nine labs in the United States, Brazil, China, Germany, India and Israel.

Over the years, the labs have employed several Nobel laureates and developed breakthrough technologies like LEDs, brain MRI and new ceramic composite materials called CMCs for next-generation jet engines.

But they don’t keep the patents for themselves. The scientists share their insights with an army of 47,000 engineers working inside GE businesses: from healthcare to oil and gas and aviation. The real payoff comes when they can use the same technology, say, CMCs, to build a better jet engine as well as to improve on a gas turbine. Mark Little, who runs GE Global Research, calls this approach the “GE store”.

Top image: Parts made from ceramic matrix composites (CMCs), like this turbine blade, will have applications inside jet engines as well as gas turbines. Above: In 1900, the GE store fit inside a barn. GE Global Research now employs 3,000 people and runs labs around the world.

One of the best examples of this technology interchange is GE’s latest Evolution Series diesel-electric locomotive called Tier 4. (It is the first locomotive that meets the U.S. government’s strict Tier 4 pollution limits.)

The locomotive’s power, fuel and exhaust systems, turbochargers and other technology combine contributions from six different GE businesses. GE engineers used them to reduce NOx emissions by 76 percent and particulate matter emissions by 70 percent, compared to previous models. The train engine could also save customers $1.5 billion in expensive infrastructure changes they would otherwise have to make to meet the new EPA regulations.

Once you start looking under the hood of GE machines, you can find the GE store everywhere. The company’s fleet of mobile power plants, for example, uses technologies originally developed for jet engines. The wind business has been looking at superconducting magnets developed for magnetic resonance machines to maximize electricity output. GE CT technology can probe the brain as well as aircraft parts and pipelines.

“The business of research is not the business of Eureka moments,” Little says. “It’s the business of planning, strategic approaches to things, hard work, and patience.”

The GE store itself is an innovation that might spread. “In the university we talk a lot about collaboration [and] discovery through bringing together disciplines,” says Yale biologist and Nobel winner James E. Rothman, who as former chief scientist at GE Healthcare still visits GE labs in Schenectady. “I have never seen it work anywhere as well as at GRC… That sort of non-quantifiable knowledge has a way of leveraging [itself] across the whole of GE.” Take a look at our videos with scientists explaining the GE store and technology applications across different GE businesses.

Smart Streets Are Made of These: San Diego Deploys America’s First Intelligent Lighting System


The denizens of the world’s sprawling megacities all face similar daily challenges: traffic, busy sidewalks, packed public transportation, no available parking. “Urbanization is coming at us like a freight train,” says Rick Freeman, global product manager for intelligent devices at GE Lighting. “The same old ways are plain going to fail. We have to get ready.”

There’s no time to spare. McKinsey & Co. reported in June that more than half of the world’s population already lives in cities and that the figure will grow to 60 percent by 2030, swelling urban areas by 1.4 billion people.

Our cities have to become intelligent if they’re going to thrive in the midst of this shift. “Intelligent lighting” systems, for example, could help the cities of the future reduce congestion, free up parking spots, find ideal locations for new bike lanes, give the police and paramedics real-time views of parks and neighborhoods and send environmental alerts, Freeman says. Two intelligent lighting pilots, in San Diego and Jacksonville, are already gathering data. “Having all that knowledge gives us insights we never had before,” Freeman says.

Top image credit: Kelsey Montague

The project in downtown San Diego involves a system GE calls “Intelligent Environments for Cities.” It’s an example of GE connecting machines – in this case LED street lamps – to the Industrial Internet, a network that links devices with software analytics and the data cloud.

GE Lighting started working with San Diego in 2014, when it installed a wireless system called LightGrid to remotely assess and control 3,000 streetlights. The city is saving more than $250,000 annually in electricity and maintenance costs as a result.


San Diego is testing GE’s Intelligent Environments for Cities system in the Little Italy and Gaslamp neighborhoods. Image credit: GE Lighting

The new downtown pilot is the first intelligent lighting application in the U.S. It involves fixtures located in the Little Italy and Gaslamp districts. The lights are equipped with sensors and computer-vision software that can pull parking and other data for real-time analysis into Predix, GE’s cloud-based platform for the Industrial Internet. “San Diego is switching on the potential of our streetlights in ways that seemed improbable just a few years ago because of GE’s Intelligent Cities technology,” said David Graham, deputy chief operating officer of San Diego.


3,000 “intelligent” street lights from GE are already helping San Diego save $250,000 per year. Image credit: GE Lighting

Officials are now studying data trends from the new pilot to determine the best benefits for the city and its residents.

The potential applications for the system are endless. The data could help software developers build effective parking apps that would lead users to an empty spot and allow them to pay for it from their smartphones.
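
As a rough illustration of the kind of parking app Freeman is describing, the sketch below polls a hypothetical JSON feed of streetlight-derived occupancy data and points a driver to the nearest vacant spot. The endpoint URL, field names and data schema are assumptions made up for this example; they are not the actual Predix API.

    import math
    import requests  # third-party HTTP client

    # Hypothetical feed of occupancy readings derived from streetlight sensors.
    FEED_URL = "https://example.com/parking/occupancy.json"

    def haversine_m(lat1, lon1, lat2, lon2):
        """Approximate great-circle distance in meters between two points."""
        r = 6_371_000
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def nearest_open_spot(driver_lat, driver_lon):
        """Return the closest spot currently reported as vacant, or None."""
        spots = requests.get(FEED_URL, timeout=5).json()["spots"]
        vacant = [s for s in spots if not s["occupied"]]
        if not vacant:
            return None
        return min(vacant, key=lambda s: haversine_m(driver_lat, driver_lon, s["lat"], s["lon"]))

    if __name__ == "__main__":
        spot = nearest_open_spot(32.7157, -117.1611)  # downtown San Diego
        print("Nearest open spot:", spot)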

Freeman envisions a future where you could make a dinner reservation, find the fastest way to get to the restaurant and book a parking spot with the same app. “The cost of computing continues to fall along with the amounts of electricity needed to power the sensors,” Freeman says. “At the same time, wireless mesh and WiFi networks are emerging across cities. The convergence of these things makes this space a really exciting place to be.”

Just last week, for example, the White House announced that it would invest $160 million in a new “Smart Cities” initiative. The money will fund new technology partnerships to help communities with their most pressing challenges, according to a White House press release.

That’s a smart idea.

Aneesh Chopra: Startup Government


Beneath the noise of the Obamacare rollout, a quiet revolution was taking place in how government delivers digital services.

Over the past year, Uncle Sam sought to scale up the approach to providing digital government services that was used during the recovery of healthcare.gov – one that focuses on the customer, uses agile practices and empowers one leader to deliver results (among other “plays”). It has done so through a recruiting binge that has attracted some of the brightest engineers from companies like Amazon, Twitter, Google and Facebook to the government.

But President Obama’s “Stealth Startup” won’t stop at merely improving the government’s capacity to create websites, as much as that is a necessary endeavor. Rather, its full impact will be felt through opening those digital services to developers as a wholesaler, whether to ease tax filing or sign up for healthcare.gov. While the consumer experience on the official government healthcare site has already dramatically improved, it was purposefully given competition with the aim of making it even easier for consumers to address their top concern — finding a plan that is right for them. That is thanks to some of the more than 70 “web broker entities” that are now certified to help streamline the process.

Take, for example, the transportation service Uber. In 2014, 60 percent of surveyed Uber drivers said they wanted tools to “simplify the selection of health insurance.” Rather than simply direct their drivers to the now-functioning healthcare.gov, Uber went further — partnering with Stride Health to offer personalized cost estimates based not just on driver information, but also on open government data, such as plan information from healthcare.gov, Census and other data from the Commerce Department, as well as population cost benchmarks from the Health and Human Services Department. Stride’s innovation was to demystify the process by stripping away the complex industry jargon and factors related to copays, deductibles and premiums — and instead presenting just a single cost estimate.

Over time, these services will grant consumers access to product recommendations, which the government isn’t generally in the business of providing. An exploration of the Medicare Part D drug plan shopping portal illustrates the challenge. The government site provides an exhaustive list of options, relying on the shopper to pick the plan, perhaps based on the drugs they already take or might take based on their chronic condition. A recommendation would be a valuable addition, because busy consumers don’t always have the time, inclination or aptitude to fully vet options and determine which is best for them. This is evidenced, in part, by Medicare beneficiaries leaving an average of $500 on the table when selecting prescription drug plans.

The key will be to motivate startups to create recommendation engines or other services that energize and simplify the selection process. To that end, healthcare.gov has enabled developers to earn broker commissions, when operating with a broker license, from the insurance companies for selling plans.

Developers will also benefit from an expansion in the amount of available, useful data. Next year, insurance companies will be directed to open up their provider directories and drug formularies in a format that recommendation apps and other services can use. That should enable new features that make it easier for consumers to keep their preferred doctors.

As I argue in “Innovative State,” we are in the midst of a new paradigm in the public sector — one that is both pro-growth and pro-government. It is initiated by a series of handshakes between opposing political entities regarding the rules of the road, and then propelled through handoffs to a growing movement of entrepreneurs and innovators who build new products and services. The manifestation of this vision is the public/private marketplace, optimizing three policy levers:

  1. Technical Standards: That consumers can safely and securely enlist third-party apps to perform digital services via open APIs;
  2. Consumer Protection/Regulatory: That such apps adhere to voluntary, enforceable codes of conduct, including basic consumer protections on privacy and fiduciary advice;
  3. Policy/Business Model: That such apps share in the savings they help generate due to better outcomes — from reduced time on unemployment rolls, to increased retirement security, to lower student loan debt rates, and so on.

An Innovative State will make it less necessary to trek to a government office, or even to visit a traditional .gov site. It will allow citizens to more frequently look toward the private or social sectors, and the options they can offer. In other words, government services will come to us, in context, rather than the other way around.

Imagine a veteran planning her next career on a talent platform connected to her post-9/11 GI Bill benefits, one that recommends the highest-quality approved training program to build on her skills and qualify her for the job that is right for her.

Imagine a Gen-X couple jump-starting their retirement planning with a financial advisor who starts the conversation with full access to their Social Security account data — so he can focus on offering advice, rather than pestering them for the data required by a litany of forms.

Imagine a Millennial completing her student loan application through a service that simultaneously matches all of the known scholarships for which she might qualify without requiring her to fill out another application.

Thankfully, the foundation for this movement towards public/private marketplaces is being laid, albeit quietly. We may never fully unclog the gridlock in Washington but we don’t need to get stuck in all that traffic. The Innovative State can serve as our bridge to a better government — and a better country — if we grab the reins and organize the public, private and social sector stakeholders to bring it to life.

(Top image: Courtesy of Thinkstock)

 

Aneesh Chopra is Co-Founder & Executive Vice President of Hunch Analytics. He was previously the first U.S. Chief Technology Officer.

A Sense of Wonder: Photographer Vincent Laforet Tapped His Inner Child When Shooting Locomotives From High Above the Colorado Prairie


In March, GE invited the Pulitzer Prize-winning photographer Vincent Laforet to a remote locomotive testing facility spread over hundreds of acres of shrubby prairie near Pueblo, Colo. Laforet hired a helicopter and produced stunning images of GE’s blue Tier IV locomotive while circling over the moving machine. After the shoot, he sat down with GE Reports to talk about shooting locomotives, climbing skyscrapers and dialing into childhood memories to capture their sense of wonder.


image

Tomas Kellner: How do you shoot a large object like a locomotive that is moving very fast in and out of frame?


Vincent Laforet:
Trying to photograph a moving train is either extremely easy or extremely difficult. There’s nothing in between. Trying to get a helicopter to fly in tandem with a massive locomotive followed by a dozen cars is a little bit of a feat. But with the right team members and the right kind of attitude, we were able to pull it off very nicely. We also had a lot of concerns about weather since we were shooting towards the end of the winter.


TK:
The sunrises and sunsets in the pictures are gorgeous. It seems like the weather played along.


VL:
We actually had to pull the shoot back one day early, which made it very difficult for our team because we had a shoot the night prior in New York, so it was a very compressed schedule. But we were lucky because the next day there was an incredible snowstorm; the sky was all white and full of snow.


image

TK: The locomotive looks almost like a toy in your pictures. Why did you choose that effect?


VL:
Whenever I get an assignment, I try to get to the bottom of what it’s about. In this case, I remembered what it was like being a little child playing with train sets. That’s one of my first memories. As a teenager, an entire table in our basement was filled with a train set. I tried to do my best to capture that sense of wonder.

Because when you’re in the real world, with adults, you see the scale of the locomotives and see what they do and how they are important for moving goods and the economy. But when you’re a kid, you’re in awe of something like a train that effortlessly wanders across the tracks in circles. I found it absolutely meditative to sit back and watch that movement.
As a photographer, I’m trying to dial into those memories, those ideals, and capture that.

image

Laforet still photographs New York City. Above is the Chrysler Building.


TK:
When did you start taking pictures?


VL:
My father was a photographer. When I was 15, I picked up the camera and haven’t put it down ever since. My favorite first memory is walking with it through NYC with a few rolls of film in my pocket and feeling the sense of discovery.


TK:
You succeeded. From above, the Pueblo test track does look like a giant train set. What were some of your other ideas for the shoot?


VL:
We thought of several ways of photographing the train, including one with a cherry picker. The problem is that you can’t get much variety, you can’t move it around, and the terrain didn’t work either. The helicopter was the best choice.

image

Above: The helicopter got so close that it blew tumbleweed into the train’s path.


TK:
I’ve been on a helicopter and it’s usually pretty bumpy. How did you keep your camera still?


VL:
We’re used to shooting at night and at high altitude with a gyroscope. It’s our specialty. This assignment was actually very smooth, there was very little wind and we had a phenomenal pilot. I would describe the helicopter as a magic carpet ride. With the right pilot in the right situation, it’s actually extremely smooth and relaxing. Having a train that moves at a predictable speed on a predictable path is the best-case scenario you can ask for.

We took off early in the morning, before sunrise. At that point we had gyros to stabilize both cameras. They eliminate most of the vibration from the helicopter. That’s how you’re able to see some of those images; in one you can actually see the headlights of the train, which is pretty cool.

I got rid of the gyros as soon as I could because they’re pretty heavy and cumbersome and went straight to handheld. And that’s where I was able to do the longer zoom stuff from the higher altitude.

imageimage

Helicopter image credit: Chris New


TK:
Were there any other challenges?


VL:
The problem with using the helicopter was that the train was not tall. You’ve got to get the helicopter quite low for the train to fill the foreground of the frame. So we needed to do some scouting to make sure that it was safe to fly low. If there were cables or antennas or any other hazards, we were not going to take the risk of having an accident.
We found several parts of the track that were actually ideal for flying very low and being able to land immediately, should anything happen.


TK:
The light looks stunning in the pictures. How do you choose your schedule?


VL:
I’ve been doing this for 25 years. We figured out where the Rockies were in the frame, where the light would most likely be coming from. Everything worked, it was pretty effortless, just fun. So much so that I was able to go up toward the very end of the shoot and take some of my favorite images, looking straight down at the geometry of the tracks.

image
TK:
That’s your train set image.


VL:
You don’t often get to do that. It’s a question of time, a question of mindset. You are usually so busy trying to get the basic photographs because of all the other challenges. The fact that the shoot went so smoothly allowed me to experiment and go very high. We were already at a high altitude in Pueblo, Colo. (elev. 4,700 feet). So I think we flew above 10,000 feet to shoot that series of images. I’ve learned that as a photographer I only make those images when I am very relaxed. And this shoot allowed for that.

image

Above: An aerial image of the Empire State Building with a fisheye lens.


TK:
Was it a GE helicopter?


VL:
I’ve been specializing in aerial photography for the last ten years. We always do a lot of research before we charter a helicopter and we know who the pilots are. I’ve been flying with my assistant Mike Isler for a decade. He’s also a helicopter pilot and we’re photographing major cities at night from a high altitude. More than 40 million people have seen the photos.


TK:
Now what type of camera are you using?


VL:
We shot this with a prototype camera, the Canon 5DS, a 50-megapixel camera. To my knowledge, this was the first commercial assignment in the United States, and possibly anywhere, to have been shot with the 5DS.

image

Above: New York’s Grand Central Terminal.


TK:
In some pictures you make parts of the locomotive appear blurry. Were you using a special lens?


VL:
That’s a technique that I revived about a decade ago. It’s been around forever but wasn’t very popular. But then in 2006, I shot a series of tilt-shift images, basically rotating the front lens element up and down and making everything above or below the center go out of focus (see above). The visual effect makes everything look like a miniature, the entire world looks like a miniature, even a 200-ton locomotive.


TK:
This sort of brings you back to your basement.


VL:
It’s the full circle thing.

image

Above: A night image of London’s Tower Bridge from Laforet’s latest project, Air. All images by Vincent Laforet except where noted.

How Insights from Building Jet Engines Help Doctors Spot Faulty Insurance Claim Denials


It’s an endless headache, a migraine really, for American health organizations and patients alike: claims for treatment denied by insurance companies, leading to round after round of frustrating phone calls to get payment disputes resolved. Now, thanks to an innovation made across multiple GE businesses, relief could be at hand.

The aptly named DenialsIQ™ debuted at the Healthcare Information and Management Systems Society (HIMSS) conference in Chicago earlier this year. The software, developed by GE Healthcare, uses advanced analytics to help health systems find claims that were initially denied by insurance companies.

At the core of the system is a patent-pending statistical algorithm that analyzes denials.

image

DenialsIQ harnesses big data to streamline healthcare information and find claims that were initially denied by insurance companies.

Health systems and patients both benefit. The former can help ensure they are receiving their payments accurately and the latter live with less stress.

DenialsIQ is also an example of what company executives call the GE Store — the sharing of ideas and expertise between GE’s businesses that makes the combined company more valuable than the sum of its parts. That’s because the ideas and insights that drive the software came from GE’s aviation, capital and healthcare IT businesses.

The innovator who put the disparate insights together was Marc Edgar, an information scientist at GE Global Research in Niskayuna, New York.

Since Edgar joined GE in 1994, he has harnessed data to improve the design and efficiency of jet engines, wind turbines, power plants and medical equipment.

image

The software picks out trends in claims denials, enabling healthcare professionals to spot patterns and manage claims more efficiently.

DenialsIQ first came to him while he was working with GE Aviation to improve the performance and manufacturing quality of aircraft engines.  “The core of the DenialsIQ algorithm actually goes back to GE’s Six Sigma thinking about the causes of variation and defects,” Edgar says.

He wanted to find a way to analyze the reams of manufacturing data, but humans only get the answers to the questions they think to ask. Modern computing, however, can model hundreds of billions of scenarios and hypotheses and ask the right questions to uncover the most interesting and actionable items, he said. That was the first piece of the puzzle.

Then, when he was working with GE Capital, Edgar learned that the way financial payment transactions for healthcare providers were processed often contained “defects,” just like the manufacturing process in aviation.

The idea finally came together when Edgar started working in 2012 with GE’s healthcare IT team on solving problems around customer expenditures. In 2013, he visited a New York hospital and witnessed a “cafeteria-sized room full of people” working to resolve denials between the health organization and insurance companies. “There has to be a better way,” he remembers thinking.

image

Big data is tough to process when medical records look like this.

Edgar took sample data sets from a number of healthcare institutions and developed an algorithm to spot patterns in the data that his team could identify as problematic. He took the algorithm to the Healthcare IT team and offered it as a solution to denials processing. “We knew if we could improve that process, it would mean a lot to the financial performance of a healthcare provider.”

DenialsIQ, which officially launched in July, identifies a common pattern in denials, and isolates where the problem is and its root cause. “The value of this algorithm is a game changer for customers,” says Andrew Slotnick, senior manager for product marketing at GE Healthcare’s Clinical Business Solutions unit. “This tool allows you to visualize patterns and drill down on certain types of denials that are most actionable.”

Once a provider learns the patterns in its denied claims, it can rework them in order to get paid. It can also spot and fix problems internally to avoid a repeat.
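
GE hasn’t published the patent-pending algorithm, but the general idea of surfacing actionable denial patterns can be sketched in a few lines: group denied dollars by a handful of claim attributes and flag combinations that account for an outsized share. The claim records, field names and threshold below are hypothetical illustrations, not GE’s method.

```python
# A minimal sketch of denial-pattern surfacing over hypothetical claim records.
# This illustrates the general idea only; it is not GE's patent-pending algorithm.
from collections import defaultdict

denied_claims = [
    {"payer": "PayerX", "reason": "CO-16 missing info", "dept": "Radiology",  "amount": 420},
    {"payer": "PayerX", "reason": "CO-16 missing info", "dept": "Radiology",  "amount": 380},
    {"payer": "PayerY", "reason": "CO-97 bundled",      "dept": "Cardiology", "amount": 150},
    {"payer": "PayerX", "reason": "CO-16 missing info", "dept": "Radiology",  "amount": 510},
]

# Group denied dollars by (payer, denial reason, department).
totals = defaultdict(float)
for claim in denied_claims:
    totals[(claim["payer"], claim["reason"], claim["dept"])] += claim["amount"]

grand_total = sum(totals.values())

# Flag any pattern responsible for more than 25% of denied dollars (arbitrary threshold).
for pattern, dollars in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    share = dollars / grand_total
    if share > 0.25:
        print(f"Actionable pattern {pattern}: ${dollars:,.0f} ({share:.0%} of denied dollars)")
```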

Edgar said the DenialsIQ algorithm is now being tested in other businesses across GE to identify the root causes of software bugs, problems with financial charges and manufacturing quality issues.

DenialsIQ was built on Predix, GE’s cloud-based platform for collaborating on projects involving big data, and shaped using FastWorks — GE’s approach to building lean start-ups and prototypes, using the agile method of iterative “sprints” of work among collaborating teams.

Customers are paying attention. WESTMED Practice Partners, Orlando Health and UC Irvine gave it a test run, and GE found it could surface patterns in up to 80 percent of the total denied claims. Roughly one in every three of those denials fell into patterns where money could be recovered or action could be taken to avoid similar denials in the future. GE Healthcare IT President and Chief Executive Officer Jan de Witte called it a “beautiful application.”

“It took a long time for all of the technical parts of DenialsIQ to incubate and come together,” said Edgar, who enjoys studying 18th century colonial life, including reading and reprinting 18th century newspapers, when he is not working on cutting-edge technology. “My experiences (over the years) informed it.”

The First American Jet Engine Was Born Inside a Power Plant: A GE Store Story


For most people, Thomas Edison is the man who came up with the first practical light bulb. But Edison was also an inveterate entrepreneur who parlayed his patents into new industries and enduring businesses. Take GE, the result of an 1892 merger between his Edison General Electric Co. and Thomson-Houston Electric Co. It has since grown into an industrial giant with $148 billion in annual revenues, making everything from MRI scanners to gas turbines and jet engines.

Although these businesses may seem very different, they often trace their origin to his lab and a point in history where Edison, light and electricity intersected. The light bulb led him into X-rays and the medical imaging business, and GE’s expertise in power generation and gas turbine engineering gave birth to the company’s aviation business. (This sharing is a two-way street. Aviation engineers are now helping their colleagues in power generation build more efficient gas turbines with their jet engine know-how.)

It’s in part because of these synergies – GE executives call this cross-pollination “the GE store” – that GE Power & Water and GE Aviation alone produced a combined $50 billion in revenues in 2014, more than a third of the company’s total. Take a look at their intertwined history.

image

Top image: A GEnx engine for the Dreamliner. GE Aviation has its roots inside a power plant. Image credit: Adam Senatori/GE Reports

 

Edison’s light bulb and the wave of electric devices that followed created a huge demand for electricity. Initially, companies were using piston engines to power generators, but they quickly switched to more efficient steam turbines. In 1903, GE engineers Charles Curtis and William Emmet built what was then the world’s most powerful steam turbine generator for a power plant in Newport, R.I. (see above). It required one-tenth the space and cost two-thirds less than the equivalent piston engine generator.

image

It was also in 1903 that GE hired young turbine engineer Sanford Moss (above). Moss had just received a doctorate in gas turbine research from Cornell University. At GE, he started building a revolutionary radial gas compressor that used centrifugal force to squeeze the air before it entered the gas turbine – the same force that pushes riders up into the air on a swing carousel.

Moss’s early experiments failed; his machine guzzled too much fuel and produced too little power. But his patent and his revolutionary compressor design were sound and found many applications: from supplying air to blast furnaces to powering pneumatic tube systems. He didn’t know it, but he had pointed the way to the jet engine before the Wright Brothers even took off.

image

In November 1917 – at the peak of World War I – GE President E.W. Rice received a note from the National Advisory Committee for Aeronautics, the predecessor of NASA, asking about Moss’s radial compressor. WWI was the first conflict that involved planes, and the agency wanted Moss to improve the performance of the Liberty aircraft engine. The engine was rated at 354 horsepower at sea level, but its output dropped by half in the thin air at high altitudes. Moss (right in the picture above) believed that he could use his compressor to squeeze the air before it entered the engine, making it denser and recovering the engine’s lost power.

image

Using a mechanical device to fill the cylinders of a piston engine with more air than it would typically ingest is called supercharging. Moss designed a turbosupercharger that used the hot exhaust coming from the Liberty engine to spin a turbine driving his radial compressor, squeezing the air entering the engine. In 1918, he tested the design at 14,000 feet on top of Pikes Peak, Colo. The engine delivered 352 horsepower, essentially its rated sea-level output, and GE entered the aviation business.

image

The first Le Pere biplane powered by a turbosupercharged Liberty engine took off on July 12, 1919. “The General Electric superchargers thus far constructed have been designed to give sea-level absolute pressure at an altitude of 18,000 feet, which involves a compressor that doubles the absolute pressure of the air,” Moss wrote.
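
A quick check against the standard-atmosphere model (an approximation added here, not Moss’s own calculation) shows why a 2:1 pressure ratio was the right target: ambient pressure at 18,000 feet is almost exactly half of its sea-level value.

```python
# Standard-atmosphere pressure at 18,000 ft, as a rough check on Moss's 2:1 figure.
# Constants are the usual ISA troposphere values.
P0 = 101325.0      # sea-level pressure, Pa
T0 = 288.15        # sea-level temperature, K
L  = 0.0065        # temperature lapse rate, K/m
g  = 9.80665       # gravity, m/s^2
M  = 0.0289644     # molar mass of air, kg/mol
R  = 8.31447       # universal gas constant, J/(mol*K)

h = 18000 * 0.3048                              # 18,000 ft in meters
p = P0 * (1 - L * h / T0) ** (g * M / (R * L))  # ISA troposphere formula

print(f"Pressure at 18,000 ft: {p/1000:.1f} kPa ({p/P0:.0%} of sea level)")
# Prints roughly 50%, so doubling the absolute pressure restores sea-level conditions.
```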

image

Planes equipped with Moss’s turbosupercharger set several world altitude records.

imageimage

In 1937, on the eve of World War II, GE received a large order from the Army Air Corps to build turbosuperchargers for Boeing B-17 and Consolidated B-24 bombers, P-38 fighter planes, Republic P-47 Thunderbolts, and other planes. GE opened a dedicated Supercharger Department in Lynn, Mass. In 1939, Moss proposed building one of the first turboprop engines. Trained as a gas turbine engineer, he was later inducted into the National Aviation Hall of Fame.

image

But GE’s aviation business was just getting started. In 1941, the U.S. government asked GE to bring to production one of the first jet engines developed in England by Sir Frank Whittle. (He was knighted for his feat.) A group of GE engineers called the Hush Hush Boys designed new parts for the engine, redesigned others, tested it and delivered a top-secret working prototype called I-A. On October 1, 1942, the first American jet plane, the Bell XP-59A, took off from Lake Muroc in California for a short flight. The jet age in the U.S. had begun.

The demand for the first jet engines, called the J33 and J35, was so high that GE had a hard time meeting production numbers, and the Army outsourced manufacturing to General Motors and Allison.

imageimage

GE decided to double down and invest in more jet engine research. The J33 and J35 engines used a radial – also called centrifugal – compressor to squeeze air, similar to the design that Moss developed for his turbosuperchargers. But GE engineers started working on an engine with an axial compressor that pushed air through the engine along its axis. (All jet engines use this design today.) The result was the J47 jet engine that powered everything from fighter jets like the F-86 Sabre to the giant Convair B-36 strategic bombers. GE made 35,000 J47 engines, making it the most produced jet engine in history.

image

The J47 also found several off-label applications. The Spirit of America jet car used one, and a pair of them powered what is still the world’s fastest jet-propelled train. They also served on the railroad as heavy-duty snow blowers.

In 1948, GE hired German war refugee and aviation pioneer Gerhard Neumann, who quickly went to work on improving the jet engine. He came up with a revolutionary innovation called the variable stator. It allowed pilots to change the pressure inside the turbine and make planes routinely fly faster than the speed of sound. When GE started testing the first jet engine with Neumann’s variable stator, the J79 (see below), engineers thought that their instruments were malfunctioning because of the amount of power it produced. In the 1960s, a GE-powered XB-70 Valkyrie aircraft was flying in excess of Mach 3, three times the speed of sound.

image

The improved performance made the aviation engineers realize that their variable vanes and other design innovations could also make power plants more efficient.

Converting the engines for land use wasn’t difficult. In 1959, they turned a T58 helicopter engine into a turbine that produced 1,000 horsepower and could be used for generating electricity on land and on boats. A similar machine built around the J79 jet generated 15,000 horsepower. In Cincinnati, where GE Aviation moved from Lynn in the 1950s, the local utility built a ring of 10 J79 jet engines to power a big electricity generator.

image

The first major application of such turbines, which GE calls “aeroderivatives” because of their aviation heritage, was as power plants for the Navy’s Spruance-class destroyers. The turbines now also power the world’s fastest passenger ship, the Francisco. It can carry 1,000 passengers and 150 cars and travel at 58 knots.

image

Today, there are thousands of aeroderivatives working all over the world. Most recently, they have been helping Egypt’s growing economy slake its thirst for electricity.

image

Neumann’s variable vanes (above) are also part of GE’s most advanced gas turbine: the 9HA Harriet, the world’s largest, most powerful and most efficient gas turbine. Two of them can generate the same amount of power as a small nuclear power plant.

image

At the same time, GE Aviation is working on the next-generation jet engine called ADVENT, or Adaptive Versatile Engine Technology (above). “To put it simply, the adaptive cycle engine is a new architecture that takes the best of a commercial engine and combines it with the best of a fighter engine,” says Jed Cox, who leads the ADVENT project for the U.S. Air Force Research Lab.

Wind in the Cloud? How the Digital Wind Farm Will Make Wind Power 20 Percent More Efficient


Few people embody the backyard inventor better than Charles Brush. In 1887, he built behind his mansion in Cleveland, Ohio, a 4-ton wind generator with 144 blades and a comet-like tail, and used it to power a set of batteries in his basement. Although by today’s standards the huge, 60-foot machine was massively inefficient, it started a new industry that pushed generations of engineers to make it better. Now GE has decided to go further and improve on the entire wind farm in one fell swoop.

“Every wind farm has a unique profile, like DNA or a fingerprint,” says Keith Longtin, general manager for wind products at GE Renewable Energy. “We thought if we could capture data from the machines about how they interact with the landscape and the wind, we could build a digital twin for each wind farm inside a computer, use it to design the most efficient turbine for each pad on the farm, and then keep optimizing the whole thing.”

GE calls the concept the “digital wind farm,” and this week the company offered a first glimpse of what it’s going to look like.

image

Digital wind farm designers are using a “digital twin” model (see above) residing in the cloud to build and optimize the real-world wind farm. GIF credits: GE Power & Water

The concept has two key parts: a modular, 2-megawatt wind turbine that can be easily customized for specific locations, and software that can monitor and optimize the wind farm as it generates electricity. GE says that the technology could boost a wind farm’s energy production by as much as 20 percent and create $100 million in extra value over the lifetime of a 100-megawatt farm. That value will come from building the right farm at the right place and then using data to produce predictable power and further optimize the farm’s performance.

“The world’s electricity demand will grow by 50 percent over the next 20 years, and people want to get there by using reliable, affordable, and sustainable power,” says Steve Bolze, president and CEO of GE Power & Water. “This is the perfect example of using big data, software and the Industrial Internet to drive down the cost of renewable electricity.”

The Industrial Internet is a digital network connecting, collecting and analyzing machine data. GE believes that the Industrial Internet could add $10 to $15 trillion to global GDP in efficiency gains over the next two decades.

image

America’s first wind turbine generated just 12 kilowatts of electricity. Brush built it behind his mansion, in the middle of a 5-acre backyard running along Cleveland’s fashionable Euclid Avenue. In 1892, his company, Brush Electric Co., became part of GE.

Each digital wind farm begins life as a digital twin, a cloud-based computer model of a wind farm at a specific location. The model allows engineers to pick from as many as 20 different turbine configurations – from pole height, to rotor diameter and turbine output – for each pad at the wind farm and design its most efficient real-world doppelganger. “Right now, wind turbines come in given sizes, like T-shirts,” says Ganesh Bell, chief digital officer at GE Power & Water. “But the new modular design allows us to build turbines that are tailor-made for each pad.”
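
As a rough illustration of that design-time choice, the sketch below scores a pair of made-up turbine configurations against a pad’s wind profile and keeps the one with the highest estimated annual output. The configurations, wind data and simple power model are assumptions for illustration, not GE’s digital twin.

```python
# A toy sketch of per-pad turbine configuration selection.
# Configurations, wind profiles and the power model are illustrative assumptions.
import math

RHO = 1.225  # air density, kg/m^3
CP  = 0.40   # assumed overall power coefficient

configs = [
    {"name": "tall tower / long blades", "hub_m": 110, "rotor_m": 120, "rated_kw": 2000},
    {"name": "short tower / standard",   "hub_m": 85,  "rotor_m": 100, "rated_kw": 2000},
]

def annual_energy_kwh(config, wind_hours):
    """Estimate annual energy from hours per year spent at each hub-height wind speed."""
    area = math.pi * (config["rotor_m"] / 2) ** 2
    energy = 0.0
    for speed, hours in wind_hours.items():
        power_kw = min(0.5 * RHO * area * CP * speed ** 3 / 1000, config["rated_kw"])
        energy += power_kw * hours
    return energy

# Hypothetical wind profile for one pad: wind speed (m/s) -> hours per year.
pad_wind = {5: 3000, 7: 3000, 9: 1760, 12: 1000}

best = max(configs, key=lambda c: annual_energy_kwh(c, pad_wind))
print("Best configuration for this pad:", best["name"],
      f"({annual_energy_kwh(best, pad_wind) / 1e6:.1f} GWh/yr)")
```

The real model layers in terrain, wake effects between turbines and component costs, but the shape of the decision is the same: score each candidate configuration per pad and pick the best.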

But that’s only half of the story. Just like Apple’s Siri and other machine learning technologies, the digital twin will keep crunching data coming from the wind farm and providing suggestions for making operations even more efficient, based on the software’s insights. Longtin says that operators will even be able to use data to control noise. “If there is a house near the wind farm, we will be able to change the rotor speed depending on the wind direction to stay below the noise threshold,” he says.

image

The data comes from dozens of sensors inside each turbine monitoring everything from the yaw of the nacelle, to the torque of the generator and the speed of the blade tips. The digital twin, which can optimize wind equipment of any make, not just GE’s, gobbles it up and sends back tips for improving performance. “This is a real-time analytical engine using deep data science and machine learning,” Bell says. “There is a lot of physics built into it. We get a picture that feels real, just like driving a car in a new video game. We can do things because we understand the physics – we build turbines – but also because we write software.”

The digital wind farm is built on Predix, a software platform that GE developed specifically for the Industrial Internet. Predix can accommodate any number of apps designed for specific wind farm tasks – from responding to grid demand to maximizing and predicting power output. Says Bell: “This is the start of a big journey for the wind industry.”

image

A GE wind turbine and its digital twin. Image credit: GE Power & Water

 


Cesar Cerrudo: Securing the Intelligent City


As they invest in smart technologies to improve services and save money, cities also need to step up security against cyber threats.

Cities are incorporating new technologies at an increasingly rapid pace, becoming ever smarter. Newer technologies — along with faster and easier connectivity — allow cities to optimize resources, save money and provide better services to their citizens.

The potential market for smart cities could be more than $1 trillion by 2020, with technology helping to improve everything from traffic control and lighting to energy and water management.

Yet every new innovation brings new challenges. Cities around the world — whether considered smart or not — face significant cyber security threats. These problems could have a direct impact on government, residents and the companies and organizations doing business there. Cyber security in cities is extremely important, but we have yet to fully realize the risk.

Imagine what could happen if one or more technology-reliant services stopped working. What would commuting look like with no working traffic control systems, street lights or public transportation? How would citizens respond to an inadequate supply of electricity or water, dark streets and no cameras? What if waste collection was interrupted during the summer?

These scenarios might not be as unlikely as you think. There are many cyber security problems that could trigger them, such as:

Lack of proper security testing: Cities around the world are implementing new, untested technologies. My latest research found about 200,000 vulnerable and insecure traffic control sensors installed in cities such as Washington D.C., New York, Seattle, San Francisco, London, Lyon and Melbourne.

At IOActive Labs, we constantly find vulnerable technology in use across industries. The same technology is used for critical infrastructure without undergoing any security testing. Although cities may rigorously test devices and systems for functionality and resistance to weather conditions, there is often little or no cyber security testing at all.

Technologies with poor or nonexistent cyber security features: Some vendors claim to implement security features that turn out to be obscure, nonexistent, undocumented or only described in a sales pitch. At IOActive Labs, we continue to encounter vendors with little or no experience in implementing security features, a lack of skilled security people and weak investment in security. Poor security practices are common in industrial systems and devices on the Internet of Things (IoT).

These bad practices are being propagated into smart cities, as well. Most new technologies are wireless, which makes them easy to implement and even easier to hack — if communication is not properly encrypted. Cities frequently lack good encryption, or fail to implement it correctly or turn it on.
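
As a baseline illustration of what “properly encrypted” means for sensor telemetry, the sketch below uses authenticated encryption (AES-GCM from the widely used Python cryptography package). The sensor reading is a made-up example, and key distribution and rotation, the hard part in real deployments, are deliberately left out.

```python
# A minimal sketch of authenticated encryption for wireless sensor telemetry.
# Key management is omitted here; in a real city deployment it is the hard part.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # shared between sensor and controller
aesgcm = AESGCM(key)

reading = b'{"sensor": "loop-42", "vehicles_per_min": 37}'   # hypothetical payload
nonce = os.urandom(12)                       # must never repeat for the same key
ciphertext = aesgcm.encrypt(nonce, reading, b"traffic-v1")   # confidentiality + integrity

# The receiver rejects anything that was tampered with in transit.
plaintext = aesgcm.decrypt(nonce, ciphertext, b"traffic-v1")
assert plaintext == reading
```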

Patch deployment and system updates: Because of their complexity, patches are often difficult and costly to install. It is increasingly common for cities to use vulnerable devices and systems, because vendors are either slow to release patches or patches are not available.

Lack of specific Computer Emergency Response Teams (CERTs): Existing CERTs can suffer from problems with coordination and communication. While many cities have plans for how to react to natural disasters, they don’t have any plans for responding to cyber attacks. Cities should be required to prepare for cyber attacks, given how dependent they are becoming on technology. Cities need to develop emergency plans that provide step-by-step procedures to follow during a cyber attack and educate people on how to react. Fast and effective action can be key to preventing bigger problems, including city-wide chaos.

Government bureaucracy: When dealing with security issues, there is no time to lose. On top of time pressures, cities have a shortage of workers with security skills as well as inadequate budgets, training and resources to help workers develop these skills.

Large and complex systems: When a city is running hundreds of systems and devices for critical services, a simple software bug can have huge impact. With so much complexity and interdependency, it is difficult to identify what is exposed and how the system will react.

 

Cities are currently wide open to cyber attacks, which presents a real and immediate danger. The more technology a city uses, the more vulnerable to cyber attacks it is, so the smartest cities face the highest risks. It’s only a matter of time.

For cities, being prepared is key to preventing bigger problems and chaos. That means:

 

  • Ensuring that the infrastructure is secure;
  • Conducting a security audit of technologies before they are implemented; and
  • Preparing an action plan in the case of a cyber attack.

For technology vendors, it’s time to start taking cyber security very seriously and produce more secure products.

 

When we combine the fact that the technology used by smart cities can be easily hacked with the knowledge that there are cyber security problems everywhere, smart cities risk becoming dumb cities.

(Top image: Courtesy of Thinkstock)

 

Cesar Cerrudo is Chief Technology Officer for IOActive.

A Date with Data: Taking Stock of the Emerging Digital Industrial Economy


There wasn’t much talk of Messi but a lot of conversations involving machines talking to machines in certain corners of Barcelona in mid-September, when more than 4,000 humans from over 50 countries converged on the Catalan city for the inaugural Internet of Things Solutions World Congress. This week, the digital caravan moves to San Francisco – the font of Internet disruption – where GE is holding its annual Minds + Machines conference.

Cisco, a big player in the field, believes that there will be 50 billion connected “things” by 2020, including everything from FitBits to jet engines. GE, which is investing billions in “intelligent machines” and turning itself into a digital industrial company, is focusing on the heavy-duty end of the IoT spectrum: the Industrial Internet.

The sizes of both events – GE’s gathering expects 1,500 visitors and will take up San Francisco’s imposing Fort Mason – illustrate that intelligent machines are quickly growing up. In fact, their progress invites comparisons with the rapid rise of mobile computing and smartphones.

 

In just one decade, their ecosystem has connected billions of people around the world and transformed how we live, work and interact. The digital industrial world is now starting to see something similar.

Last year, GE and a group of companies including AT&T and Intel formed the Industrial Internet Consortium (IIC) to create a digital ecosystem connecting billions of machines ranging from blowout preventers and locomotives to CT scanners. In just one year, the Consortium has grown to more than 200 members.

Jumping on this accelerating train makes a lot of good sense. There are hundreds of billions of dollars at stake in new growth opportunities in what Colin Parris, GE’s vice president of software research, calls the emerging “data economy.”


At the Barcelona event, Parris delivered a keynote address on GE’s Digital Twin initiative and he’s also coming to San Francisco.

Parris says that consumer juggernauts like Amazon, Google and Apple have already given birth to their own digital twins modeling human customers and suggesting new business models and value streams.

GE and other industrial companies are now doing the same by extracting, analyzing and modeling with data from machines and factories. The digital twin can simulate the operations of wind turbines, ships and power plants inside a computer and helps engineers to find and optimize the best designs before they even start making things from metal. “Through the digital twin, we will create continuously evolving digital models for machines that will extract value for the Industrial Internet in the same manner that companies in the consumer internet space have done,” Parris says.

One of the high points of the Barcelona congress was the introduction of new Industrial Internet test beds that allow IIC members to measure the efficiency of their products. The brains behind a test bed developed by GE and Infosys will also be at Minds + Machines in San Francisco this week. It’s already helping engineers find better ways to manage and maintain landing gear.

Jayraj Nair, who leads the test bed effort at Infosys, said the partnership with GE was a great example of how the IIC is encouraging more collaboration and fostering industrial-strength solutions for the Industrial Internet.

Still, the opportunity for collaboration does not come without challenges. Speaking on a panel in Barcelona, Katherine Butler, general counsel for GE Software, said that open-source innovation was still not broadly understood and that more education was needed.

Forums like Minds + Machines are the perfect place to get the word out about the rising digital industrial future.

Watch It Live: Minds+Machines 2015


GE is hosting the fourth annual Minds + Machines event, one of the defining moments of the Industrial Internet. The event, which runs September 29 to October 1 in San Francisco, brings together global industry leaders with the best and brightest of the technology world to explore the opportunities and challenges of the Industrial Internet.

 

Watch some of the speeches and panel discussions live at the link below.

The live stream schedule is:

September 29 | 1-3 pm PT

September 30 | 8-8:45 am PT | 5-5:45 pm PT

 

Watch it here:

GE to Build New State-Of-The-Art Engine Plant in Canada to Fill Gap from Ex-Im Bank Lapse


GE today announced plans to build a new, state-of-the-art “Brilliant Factory” in Canada, with manufacturing capacity for multiple business lines including Power & Water, Oil & Gas and Transportation. The plan will create 350 manufacturing jobs in the first phase and will secure access to Canadian export financing to fill the gap left by the lapse of the U.S. Export-Import Bank.

As part of the plan, GE’s Power & Water business will stop making its trademark orange gas engines in Waukesha, Wis.

The company said today it notified employees in Waukesha and more than 400 U.S. suppliers of its plans. In Wisconsin alone, suppliers generate almost $47 million in revenue from the Waukesha plant.

GE plans to build a new US$265 million state-of-the-art “Brilliant Factory” in Canada that will use data, analytics and software to optimize efficiency and streamline production. The company expects to finish building the factory in 20 months. GE will design it as a flexible production facility that can expand over time and also support manufacturing requirements for other GE businesses.

The new GE plant in Canada will also have back-up capacity to manufacture diesel engine components for GE Transportation. GE currently employs 350 people at its manufacturing facility in Waukesha, building gas engines for compression, mechanical drives and power generation applications.

GE said it would build its new facility north of the border in order to access additional support from the country’s export credit agency, Export Development Canada (EDC). This is the fourth time this month GE has announced it was seeking export financing from a foreign government and creating new jobs abroad. The other instances included France and the U.K.

GE and other companies have been seeking to blunt the impact of the U.S. Export-Import Bank’s lapse in operations since June, when the U.S. Congress did not renew its charter. Since then, the bank has been unable to provide new loans, making the United States the only major industrial country to operate without an export credit agency.

“We believe in American manufacturing, but our customers in many cases require ECA (Export Credit Agency) financing for us to bid on projects,” said GE Vice Chairman John Rice. “Without it, we cannot compete and our customers may be forced to select other providers. We know these announcements will have regrettable impact not only on our employees but on the hundreds of U.S. suppliers we work with that cannot move their facilities, but we cannot walk away from our customers.”

In 2014, EDC facilitated exports and investments valued at approximately 100 billion Canadian dollars.  The agency actively supports global expansion for manufacturers based in Canada, giving backing to over 7,000 customers in close to 200 countries last year.

GE said it was currently bidding on $11 billion of projects that require export financing. While more than 60 other countries have export credit agencies (ECAs) that support domestic manufacturing for export, the U.S. currently operates without a functioning one.

Rice said that EDC joined “a growing list of export credit agencies interested in supporting GE’s global business operations and customer base.”

Let’s Get Connected: GE Digital Chief Bill Ruh Talks About Intelligent Machines and Our Optimized Future


Bill Ruh believes in intelligent machines as an emerging reality, not a distant sci-fi concept. “I’m looking forward to a future where the power never goes out, where the water is always clean, where airplanes always run on time, and where the health care industry is working to its full capacity,” he says. “Connected, intelligent machines can get us there.”

It’s a bold statement, but then again, Ruh has an inside view. As the head of GE Digital – a brand-new business unit helping GE and its customers get connected to the Industrial Internet – his mission is to make machines better by making them smarter.

Starting Tuesday, Ruh will have a lot of company. He will be in San Francisco at GE’s Minds + Machines conference, a 3-day summit that brings together leading thinkers and practitioners in the field, including MIT’s Andrew McAfee, NASA’s Adam Steltzner, the engineer who helped land the rover Curiosity on Mars, and Ed Catmull, a computer scientist and president of Pixar Animation Studios.

image

“I’m looking forward to a future where the power never goes out,” says Bill Ruh, head of GE Digital.

Like them, Ruh likes to think big about bits. To him, the Industrial Internet – a network that links machines with software analytics and data in the cloud – is a bigger deal than the Internet of Things, which already connects fitness monitors, smoke detectors, light switches and other consumer devices to apps and smartphones. “There’s a difference between running a smart thermostat in your house and controlling a power plant,” Ruh says. “We work in mission-critical environments.”

So just how much could connected machines and the Industrial Internet change the world? GE believes that they could add $10 to $15 trillion to the global economy over the next 20 years. To start, take a look at your own buying habits, and how data connections have changed the consumer world.

image

Machines are already generating many terabytes of data every day, multiple times the information stored in the printed collection of the Library of Congress.

In the 1990s, when the Internet was just becoming well known, many companies built “brochureware,” simple webpages with their phone numbers and hours. But the founders of Amazon, PayPal and other future web giants realized that the consumer world was going through a seismic change. “They said, what is the business opportunity when a billion people get connected?” Ruh says. “The next thing you know, they turned the retail world upside-down.”

GE is now doing something similar in the digital industrial space. It’s looking at business opportunities in a world with billions of connected machines generating many terabytes of performance data every day, the equivalent of several printed collections of the Library of Congress. “When you’re willing to see beyond the financial impact in the next quarter and look further […] you can begin to see new products and new markets that you might never otherwise have thought about,” Ruh says.

Take efficiency. An “intelligent” jet engine, for example, can carry hundreds of sensors monitoring everything from pressures and temperatures to vibrations. The sensors remotely feed the data to the cloud where software can analyze the same information from an entire fleet of such engines, no matter where in the world they are, optimize them and spot problems before they cause an unplanned outage. “Any asset can be optimized, if the machines can share the right information,” Ruh says.

At such scale, even tiny tweaks can lead to huge savings. “We’re one of the world’s biggest generators of electricity,” Ruh says about GE’s power generation technology. “A one percent savings in fuel across our fleet of turbines is worth a little under $7 billion a year for our customers.”

Intelligent machines can also predict when they’re going to break down. Just as Facebook users share status updates with friends, machines running on GE’s Predix software platform for the Industrial Internet are now updating their human handlers.

While still in the air, a jet engine, for example, could spot a part that’s getting unusually hot and ping an engineer at the next airport that it needs maintenance. In fact, it could even order the right replacement part it needs, and have that part delivered and waiting in the hangar when the engineer gets there, making his job quicker and easier. “We think about this as connecting minds and machines,” he says. “If you don’t figure out the person in this, you won’t achieve the results.”
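
Stripped to its essentials, that kind of self-reporting is a stream of sensor readings compared against a fleet baseline. Here is a minimal sketch with hypothetical readings and a simple statistical threshold; it is not Predix code or real engine data.

```python
# A minimal sketch of flagging an unusually hot part from streaming sensor data.
# Readings, thresholds and the alert text are hypothetical; real systems compare
# against fleet-wide baselines and physics models rather than a fixed cutoff.
from statistics import mean, stdev

fleet_baseline_c = [612, 618, 609, 615, 621, 611, 617]   # typical exhaust-gas temps
mu, sigma = mean(fleet_baseline_c), stdev(fleet_baseline_c)

def check_reading(engine_id, temp_c, n_sigma=3):
    """Ping maintenance if a reading sits well outside the fleet baseline."""
    if temp_c > mu + n_sigma * sigma:
        return f"ALERT {engine_id}: {temp_c} C exceeds baseline; schedule inspection"
    return f"{engine_id}: normal"

print(check_reading("engine-0042", 614))   # within the baseline band
print(check_reading("engine-0042", 655))   # flagged for maintenance at the next stop
```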

Looking at the industrial landscape, Ruh worries about speed. “What keeps me up at night is that we’re not going fast enough,” he says. “The opportunities here are so enormous, the benefits to business and industry are so enormous. The faster we can move the better we can make it for everyone.”

This week’s sold out Minds + Machines conference should put him more at ease.
