Aclima, Inc., a San Francisco-based company that designs and deploys environmental sensor networks, is partnering with Google Earth Outreach to map and better understand urban air quality. Google Street View cars can be equipped with Aclima’s mobile sensing platform to measure nitrogen dioxide, nitric oxide, ozone, carbon monoxide, carbon dioxide, methane, black carbon, particulate matter, and volatile organic compounds (VOCs).
As a pilot, in August 2014, Aclima instrumented three Google Street View vehicles to perform a month-long system test in the Denver metro area during the DISCOVER-AQ study conducted by NASA and the US Environmental Protection Agency (EPA). The cars clocked 750 hours of drive time and gathered 150 million data points, correlated with data from EPA stationary measurement sites. EPA provided scientific expertise in study design and instrument operations as part of a Cooperative Research and Development Agreement (CRADA) with Aclima.
We have a profound opportunity to understand how cities live and breathe in an entirely new way by integrating Aclima’s mobile sensing platform with Google Maps and Street View cars. With more than half of the world’s population now living in cities, environmental health is becoming increasingly important to quality of life. Today we’re announcing the success of our integration test with Google, which lays the foundation for generating high resolution maps of air quality in cities.—Davida Herzl, co-founder and CEO of Aclima
This partnership with Aclima enables us to take the next steps in our pilot project to utilize Street View’s existing infrastructure and test Google Maps as an environmental sensing platform for mapping the environment. We hope this information will enable more people to be aware of how our cities live and breathe, and join the dialog on how to make improvements to air quality.—Karin Tuxen-Bettman, Program Manager for Google Earth Outreach
The EPA currently relies on an extensive network of stationary equipment, placed in urban areas, that measures carbon monoxide, nitrogen dioxide, sulfur dioxide, particulate matter, hydrocarbons and photochemical oxidants. The monitoring network is designed for air quality regulation, but it does not give a detailed enough picture of a community or urban area for people to get a real sense of the air pollution in their immediate surroundings.

Aclima is pioneering the use of small-scale sensors to map indoor and outdoor environments. Aclima Nodes contain plug-and-play sensor modules; the networks measure a broad and growing spectrum of variables. Aclima manages the deployed network of Nodes around the clock. A suite of network management tools enables >99% uptime, remote troubleshooting and firmware upgrades. The system is compatible with a broad range of communications protocols, from Ethernet to Wi-Fi to Bluetooth. The data processing backend was built from the ground up on cloud-based infrastructure. Resilient and scalable, it can support millions of Nodes. The system ingests, converts, and computes real-time sensor data in preparation for streaming to various analytics tools and visual interfaces. Data points are continually evaluated by Aclima’s proprietary algorithmic calibration engine to ensure data quality.
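The article does not describe how Aclima’s proprietary calibration engine works. As a rough, hypothetical sketch of the general idea, a backend might periodically fit raw low-cost-sensor readings against a co-located reference instrument (such as an EPA station) and apply the resulting correction to subsequent data; all names and the simple linear model below are illustrative assumptions, not Aclima’s actual method.

```python
from dataclasses import dataclass

@dataclass
class CalibrationModel:
    """Linear correction mapping raw sensor output to calibrated values."""
    gain: float = 1.0
    offset: float = 0.0

    def apply(self, raw: float) -> float:
        return self.gain * raw + self.offset

def fit_linear(raw_readings, reference_readings):
    """Least-squares fit of gain/offset against a reference instrument."""
    n = len(raw_readings)
    mean_x = sum(raw_readings) / n
    mean_y = sum(reference_readings) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(raw_readings, reference_readings))
    var = sum((x - mean_x) ** 2 for x in raw_readings)
    gain = cov / var
    return CalibrationModel(gain=gain, offset=mean_y - gain * mean_x)

# Toy example: a low-cost sensor that reads 2.5 units low vs. the reference.
raw = [10.0, 20.0, 30.0, 40.0]
ref = [12.5, 22.5, 32.5, 42.5]
model = fit_linear(raw, ref)
print(round(model.apply(25.0), 2))  # 27.5
```

In a real deployment the fit would run continuously per sensor and per pollutant, and flag readings whose residuals drift out of tolerance.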
Aclima’s mobile sensing platform on Street View cars complements EPA’s regional air measurement network by introducing a new body of knowledge about air quality at the street level.
Our research partnership with Aclima is helping us understand air pollutants at the local and community level, and how they move in an urban area at the ground level. New mobile air measurements can complement existing stationary measurements for a more detailed picture of personal and community air quality.—Dan Costa, Sc.D., National Program Director, EPA’s Office of Research and Development
This Fall, Aclima and Google will expand mapping efforts to the San Francisco Bay Area and work with communities and scientists to explore applications for this new environmental tool.
The announcement builds on Aclima’s established partnership with Google to map the indoor environment. Together, they have created a network that is the first of its kind, connected across 21 Google offices around the world.
The system processes 500,000,000 data points each day on indoor environmental quality, including comfort measures of temperature, humidity, noise, and light, and air pollutants like carbon dioxide and particulate matter. The information allows Google to evaluate environmental factors in their offices and, in the future, make better decisions on workplace design to support employee wellbeing, productivity and creativity.
In a new study, Roland Berger Strategy Consultants explores the strategy of module consolidation as a solution for the feature- and function-driven increasing complexity of vehicle electronic architectures.
Consumers increasingly expect the latest and greatest in electronics and safety when purchasing a car, regardless of type. Whether it’s an instrument cluster with a graphics-rich, fully reconfigurable display or a lane departure warning system, a tremendous amount of processing power and electronic communication is required. The current approach to adding these features to vehicles’ electric/electronic (E/E) architectures is generally “ad hoc”: simply adding a new ECU every time a new vehicle feature requires processing power. This has resulted in vehicles with as many as 100 ECUs and more than 100 million lines of code in ultra-luxury cars.
While this solution has worked for a time, the industry is now at a tipping point, with this approach becoming too expensive and adding too much complexity to be sustainable, according to the consulting firm.
Continually adding ECUs is no longer economical. Upgrading a cluster from an analog display to a fully reconfigurable screen with the processor required to run it can add US$150 to a vehicle. In total, advanced versions of the major ECUs in today’s cockpit (IVI, telematics unit, cluster and radio) can add up to more than US$800. Premium car customers might be willing to accept these added costs, but more and more mainstream vehicle buyers expect advanced electronics at a more economical price point.
Patchwork electronic architectures are becoming too complex to deliver the experience consumers expect. Adding ECUs increases development complexity, which slows the development process and adds unnecessary cost.
Adding ECUs increases traffic and slows communication on vehicles’ already burdened electronics networks. This can impact the seamlessness of the customer’s experience and doesn’t allow for certain operations to be processed quickly enough to enable features that require fast, reliable input from multiple ECUs.
Additionally, these systems can be too slow and unreliable to perform the level of processing required for automated and connected driving.
All major automotive trends today, from improved cockpit electronics to new ADAS features, are largely enabled by advanced electronics systems. OEMs will not be able to keep up with consumers’ expectations, both in terms of quality and price, if they continue to add ECUs every time they want to add a new feature. A blank-sheet approach to electronic architecture design is needed.—Thomas Wendt, a Senior Partner in Roland Berger’s North American Automotive practice
Module consolidation is a technical solution that leverages modern multicore processing technologies to operate multiple ECUs that traditionally each had their own processor.
In a multicore solution, these ECUs retain dedicated processing space, usually in the form of their own core in the processor. However, a number of redundant components are eliminated, including housings, power supplies, wire mounts and harnesses, as well as the processors themselves, all saving cost. Additionally, ECUs communicate within the processor itself instead of over a network such as the CAN bus, increasing speed and reducing complexity.
Roland Berger’s study, Consolidation in Vehicle Electronic Architectures, quantifies the cost advantages of module consolidation from the perspective of an OEM. For example, taking a sample set of cockpit electronics, Roland Berger conducted a total cost of ownership (TCO) analysis, comparing the cost of independent ECUs to the cost of a consolidation solution running on a multicore processor with the same feature and function set. The result was a TCO advantage of US$175 per vehicle, including direct piece price savings which are just the “tip of the iceberg.” The savings identified also include indirect, yet quantifiable, advantages of consolidation such as weight savings.
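The TCO logic described above can be illustrated with back-of-the-envelope arithmetic. All component costs below are invented for demonstration; the article gives only the study’s ~US$175/vehicle result, not its inputs.

```python
# Hypothetical cockpit bill of materials: each independent ECU carries its own
# processor, housing, power supply and wiring harness.
COCKPIT_ECUS = {
    # name: (processor, housing, power supply, harness) in USD
    "cluster":    (60, 15, 10, 20),
    "IVI":        (80, 15, 10, 20),
    "telematics": (40, 10,  8, 15),
}

def tco_independent(ecus: dict) -> int:
    """Sum of every ECU's full component set."""
    return sum(sum(costs) for costs in ecus.values())

def tco_consolidated(multicore_processor: int, shared_overhead: int) -> int:
    """One multicore processor plus a single shared housing/power/harness set."""
    return multicore_processor + shared_overhead

independent = tco_independent(COCKPIT_ECUS)   # 303
consolidated = tco_consolidated(140, 40)      # 180
print(independent - consolidated)             # 123
```

The study’s figure additionally counts indirect but quantifiable items such as weight savings, which a simple piece-price model like this one omits.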
Despite module consolidation’s clear advantages, OEMs have been slow to adopt the solution. This is largely due to safety and security concerns related to running multiple ECUs on the same processor. While there is some merit to this concern, advanced suppliers have already developed a solution to this issue, known as “hardware virtualization.”
ECU consolidation roadmap. Although specifics will vary by OEM, in general it is clear that ECU consolidation will take place in stages, with differing groups of ECUs being consolidated first and then further combined into full system domain controllers, the report says.
There will likely be a clear divide between the consolidation of ADAS-related ECUs and cockpit ECUs, due to safety and security concerns (e.g. so a hack of the IVI can’t control the chassis system). It is also clear, according to the report, that Premium OEMs are leading the charge towards consolidation, and given their existing level of infotainment systems and ADAS, have already begun consolidation.
Given the clear need and the availability of a solution, it is now time for the automotive community to rethink legacy E/E architectures and adopt consolidated module solutions, the consulting firm suggests. The first movers have an opportunity to capture a tremendous amount of value, both through savings and through the ability to offer a superior product to end consumers.
All major trends in the automotive industry are increasing the number and complexity of ECUs—the industry is now at a tipping point, where adding ECUs is no longer sustainable, both economically and functionally. Module consolidation, the use of a single domain controller as a replacement for an independent processing unit for each ECU, is a solution already being made available to address the complexity issues arising from these trends.
…Despite these clear advantages, many OEMs have been slow to adopt consolidated modules. These OEMs risk losing the ability to stay competitive against not only incumbent competitors, but also disruptive market entrants like Google and Apple. Regardless of their current level of adoption, all industry players need to act now to ensure they are positioned to fend off these risks and are able to deliver the user experience consumers demand at the price they expect.—“Consolidation in Vehicle Electronic Architectures”
Intel Corporation and Micron Technology, Inc. unveiled 3D XPoint technology, a non-volatile memory that has the potential to revolutionize any device, application or service that benefits from fast access to large sets of data. Now in production, 3D XPoint technology is a major breakthrough in memory process technology and the first new memory category since the introduction of NAND flash memory in 1989.
The explosion of connected devices and digital services is generating massive amounts of new data. To make this “big data” useful, it must be stored and analyzed quickly, creating challenges for service providers and system builders who must balance cost, power and performance trade-offs when they design memory and storage solutions. 3D XPoint technology combines the performance, density, power, non-volatility and cost advantages of all available memory technologies on the market today, the partners said. The technology is up to 1,000 times faster and has up to 1,000 times greater endurance than NAND, and is 10 times denser than conventional memory.
For decades, the industry has searched for ways to reduce the lag time between the processor and data to allow much faster analysis. This new class of non-volatile memory achieves this goal and brings game-changing performance to memory and storage solutions.—Rob Crooke, senior vice president and general manager of Intel’s Non-Volatile Memory Solutions Group
One of the most significant hurdles in modern computing is the time it takes the processor to reach data on long-term storage. This new class of non-volatile memory is a revolutionary technology that allows for quick access to enormous data sets and enables entirely new applications.—Mark Adams, president of Micron
Following more than a decade of research and development, 3D XPoint technology was built from the ground up to address the need for non-volatile, high-performance, high-endurance and high-capacity storage and memory at an affordable cost. It ushers in a new class of non-volatile memory that significantly reduces latencies, allowing much more data to be stored close to the processor and accessed at speeds previously impossible for non-volatile storage.
The innovative, transistor-less cross point architecture creates a three-dimensional checkerboard where memory cells sit at the intersection of word lines and bit lines, allowing the cells to be addressed individually. As a result, data can be written and read in small sizes, leading to faster and more efficient read/write processes.
3D XPoint innovations include:
Cross Point Array Structure. Perpendicular conductors connect 128 billion densely packed memory cells. Each memory cell stores a single bit of data. This compact structure results in high performance and high density.
Stackable. The initial technology stores 128Gb per die across two stacked memory layers. Future generations of this technology can increase the number of memory layers and/or use traditional lithographic pitch scaling to increase die capacity.
Selector. Memory cells are accessed and written or read by varying the amount of voltage sent to each selector. This eliminates the need for transistors, increasing capacity and reducing cost.
Fast Switching Cell. With a small cell size, fast switching selector, low-latency cross point array, and fast write algorithm, the cell is able to switch states faster than any existing nonvolatile memory technologies today.
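The cross-point addressing scheme described above can be modeled in a few lines: a cell sits at the intersection of one word line and one bit line and is selected by biasing just that pair, so no per-cell transistor is needed. The sketch below is a toy logical model with made-up dimensions, not the real 128 Gb array or its device physics.

```python
class CrossPointArray:
    """Toy model of a transistor-less cross-point memory array."""

    def __init__(self, word_lines: int, bit_lines: int):
        # Every (word line, bit line) intersection holds one memory cell.
        self.cells = [[0] * bit_lines for _ in range(word_lines)]

    def write(self, wl: int, bl: int, bit: int) -> None:
        # A write voltage across the selected pair changes only that cell.
        self.cells[wl][bl] = bit

    def read(self, wl: int, bl: int) -> int:
        # A lower read voltage across the same pair senses the cell state.
        return self.cells[wl][bl]

array = CrossPointArray(word_lines=4, bit_lines=4)
array.write(2, 3, 1)
print(array.read(2, 3), array.read(0, 0))  # 1 0
```

Because each cell is individually addressable this way, data can be written and read in small granules rather than in the large blocks NAND requires.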
3D XPoint technology will sample later this year with select customers, and Intel and Micron are developing individual products based on the technology.
The quarter’s end is a good time to confront what we had expected would happen on the oil and fuel markets with what actually happened. And this past quarter was certainly interesting in its diversity.
During the quarter and after it ended, a number of developments took place that had a profound impact on expectations of economic growth in the US, Europe and China. These included the dramatic negotiations between the Greek government and its creditors, which produced a fragile, last-minute agreement on July 13th; the deal with Iran signed on July 14th by the US, Great Britain, France, Germany and Russia after two years of arduous negotiations, lifting the economic sanctions against Iran in return for the country limiting its nuclear programme; and uncertain growth prospects for the Chinese economy, which is in desperate need of deep structural reforms rather than another stimulus package. As these developments unfolded, they prompted changes in estimates of future oil and fuel demand and caused financial flows on the currency and commodity markets to fluctuate.
Fuels and crude oil are entirely different commodities
Fuels (such as naphtha, diesel oil, and heavy fuel oil) are commodities and, like crude oil, their prices are quoted (or determined) on global commodity markets. The fuel markets, however, have their own dynamics, independent of the forces that drive the oil market. Although fuels are derived from crude oil, they form separate markets, because they are completely different commodities used for very different purposes. With a vast and highly liquid market, oil is a financial asset that is in great demand at times of prosperity. Fuel markets are also unlike one another, with gasoline markets behaving differently than diesel oil markets, because the two commodities have different applications. Diesel oil is the primary fuel for heavy transport, where its competitor and closest alternative is LNG rather than gasoline. Gasoline, on the other hand, is used for light transport and as a feedstock in petrochemical production, where its main rival is natural gas. Due to supply and demand factors characteristic of these two very distinct commodities, the prices of oil and fuels often move in opposite directions.
Margins or price scissors
Refiners buy crude oil at market prices and crack it into fuel products, which they sell at prices likewise dictated by the market. The differential between the price of benchmark crude (Brent in our case) and the petroleum products extracted from it (e.g. gasoline or diesel oil) is called the crack spread. By weighting crack spreads by the product slate typical of a region or refinery, we arrive at a model refining margin. Just to clear things up: margin is an unfortunate word in this context, as it implies a mark-up on costs, usually determined by the producer. That is how it worked before crude oil and petroleum products began to be traded on commodity exchanges, when the prices of fuels were set by oil companies. Today the term can be misleading. A refinery could not ask a price for its fuel products above the market price, because no one would buy them. It could not sell them cheaper, either, because revenues below achievable levels would be unacceptable to shareholders. In a nutshell, petroleum product prices, and hence model refining margins, are imposed on refiners by the two unconnected oil and fuel markets. To my mind, price scissors would be a more accurate term than margin. The margin (multiplied by the number of barrels of product sold) must cover total OPEX and generate profits, which are a long-term source of financing for business growth.
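The crack-spread weighting described above is simple arithmetic. The sketch below shows the mechanics; the slate shares and product prices are illustrative assumptions, not Orlen’s published methodology or actual quotes.

```python
def crack_spread(product_price: float, crude_price: float) -> float:
    """Product price minus crude price, both in USD per barrel."""
    return product_price - crude_price

def model_margin(crude_price: float, slate: dict, prices: dict) -> float:
    """Crack spreads weighted by the product slate (shares sum to 1)."""
    return sum(share * crack_spread(prices[p], crude_price)
               for p, share in slate.items())

# Illustrative numbers only: Brent at USD 109/b, a three-product slate.
brent = 109.0
slate = {"gasoline": 0.35, "diesel": 0.45, "heavy_fuel_oil": 0.20}
prices = {"gasoline": 118.0, "diesel": 121.0, "heavy_fuel_oil": 95.0}
print(round(model_margin(brent, slate, prices), 2))  # 5.75
```

Note how a negative spread on heavy fuel oil drags the weighted margin down even when gasoline and diesel spreads are healthy, which is why the product slate matters as much as any single crack spread.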
Links between oil and fuel markets
Does the fact that fuel products are made from oil not create any links between the prices of the two? The links do exist, of course, and the global refining industry’s cost curve is their conceptual illustration. The idea is that the price of a homogeneous product, like gasoline or diesel oil, should ensure that the least efficient refinery which still finds demand for its products operates at a profit. The cost curve determines the crack spread/model margin, which depends on the refining technology used. For a given oil price level, this margin determines the floor price of fuels below which the least efficient refineries on the market begin to run at a loss. When market forces cause the refining margin to shrink below the floor, the least efficient refineries yield under market pressure and go out of business.
For instance, the end of 2013 and the beginning of 2014 saw an oil price hike reflecting a “fear premium”. Fuel prices did not follow the uptrend, though, due to weak demand in the Atlantic region. As a result, unprecedented declines in refining margins forced many refiners out of business. Orlen’s refining margin in the fourth quarter of 2013 was a mere USD 0.7/b, with the oil price at USD 109/b. Bankruptcies and forced shutdowns of European refineries led to reduced supply, which gradually lifted the prices of fuels and refining margins. Orlen’s refining margin grew to USD 1.3/b in the first quarter of 2014 and to USD 2.5/b in the quarter after that, with oil prices virtually unchanged (at USD 108/b and USD 109/b, respectively).
Where price scissors (margins) are wider than the floor refining margin, technological considerations lose relevance. This has been the case since July 2014, when oil prices began their march downward. The impulse of Libyan oil coming back on the market (http://ffbk.orlen.com/blog/what-s-up-with-the-oil-market) had nothing to do with petroleum products, so there was no reason for fuel prices to drop. Just the opposite: European refiners, who had suffered a heavy blow from margins falling below the profitability floor, needed to return to normal, which slipping oil prices allowed them to do. With oil prices retreating by the month and refining margins improving, refineries increased throughput, which for European refineries, until then facing the problem of excess capacity, was both easy and profitable (fixed costs falling as a percentage of total costs improved operating efficiency). Once this opportunistic rise in production reached the market, prices fell; and since the situation changed for the entire refining sector, the gradually increasing supply of fuels ultimately had to push market prices down. From June 2014 to January 2015, when the Brent oil price hit a low (USD 48/b, down 57%), the market prices of naphtha and diesel oil slid by 55% and 48%, respectively. In the same period, refining margins grew: Orlen’s model (market) refining margin went up from USD 4.8/b in the third quarter of 2014 to USD 5.0/b in the following quarter.
What was happening with retail prices during that time? In June 2014, motorists in Poland paid an average of PLN 5.39 per litre of Eurosuper gasoline and PLN 5.24 per litre of diesel oil. In January 2015, the prices fell to PLN 4.41 and PLN 4.38, respectively, with the price of gasoline down 18% and the price of diesel down 17%. Why did they fall at a slower pace than the prices quoted on international markets? Trade on international markets is in “pure” petroleum products, containing no biocomponents or quality-improving additives (which are more expensive and are blended in varying proportions, depending on the type of final consumer), while retail prices are charged on final products. As prices are transmitted from global markets, through wholesale, to retail, fluctuations are evened out due to customer proximity (consumers dislike price hikes, while wholesalers and retailers dislike price drops, so movements in wholesale and retail prices are less sharp and less frequent). These factors are always at play. However, in the period we are looking at, it was the appreciating US dollar that had the greatest impact on retail prices in złotys. The dollar strengthened by 14% against the euro and by as much as 21% against the złoty.
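The dampening effect of the stronger dollar is easy to see with simple arithmetic. The sketch below combines a dollar-quoted price change with a currency move; the inputs roughly mirror the period discussed (product prices down about 50% in USD, the dollar up 21% against the złoty) but are illustrative, not exact market quotes.

```python
def pln_price_change(usd_change: float, fx_change: float) -> float:
    """Combined effect on a złoty price of a USD price change and a USD/PLN move.

    usd_change: fractional change in the dollar-quoted price (e.g. -0.50).
    fx_change: fractional appreciation of the dollar against the złoty.
    """
    return (1 + usd_change) * (1 + fx_change) - 1

# Product price down 50% in dollars, dollar up 21% against the złoty:
change = pln_price_change(usd_change=-0.50, fx_change=0.21)
print(f"{change:.1%}")  # -39.5%
```

So a 50% drop in dollar terms shrinks to roughly a 40% drop in złoty terms before biocomponents, additives, taxes and the smoothing along the wholesale-retail chain narrow it further.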
What happened in the second quarter of 2015?
The price of Brent trended upwards from January to April, with the trend reversing in early May. There were several reasons behind the reversal. Besides the factors mentioned earlier, such as concerns over demand sustaining its strong momentum from the first quarter of the year and Iran’s expected return to the global oil market, it also turned out that, seeking support from production costs, oil prices had rebounded too fast from their January low, limiting the scale of the much-needed reduction in excess supply (http://ffbk.orlen.com/blog/are-crude-oil-prices-rebounding). The effect of price stimuli coming from the oil market (oil prices are quoted in US dollars) was dampened by the appreciating dollar (http://ffbk.orlen.com/blog/cheap-oil-and-strong-dollar). In May the rise in the price of Brent crude slowed, with the price at USD 64.3/b (up 34% on January). In June the price edged down to USD 62/b, with the downtrend continuing into July (the average price as at July 17th was USD 58/b).
By contrast, fuel prices have been trending upwards since January. The prices of naphtha and diesel oil quoted on international markets climbed 46% and 26%, respectively, from January to May 2015. The rise was buoyed by expected growth in demand, reinforced by the positive effects of lower oil and fuel prices, which feed through into the economy with a delay of two to three quarters. In June gasoline prices advanced by a further 2.7%, while diesel oil prices dropped 3.7%. It is interesting to note the asymmetry between the market prices of gasoline and diesel oil, which was particularly pronounced in June. It followed from strong demand from the petrochemical industry after the end of its planned maintenance season, further bolstered by the approaching driving season in the US, which coincided with the end of the planned maintenance season in refineries and sizeable demand for gasoline blending components.
Refining margins kept widening, with the global market prices of fuels growing faster than oil prices. Orlen’s model refining margin rose from USD 7.5/b in the first quarter of 2015 to a record USD 9.8/b in the second quarter. The rising global market prices of fuels, which pushed margins higher, were also the main reason why retail prices started to march back up from the low recorded in January 2015. In June drivers paid PLN 4.93 for a litre of Eurosuper gasoline (up 11.8% on January) and PLN 4.74 for a litre of diesel (up 8.2% on January). These price rises would have been less substantial had the US dollar not strengthened by 1%.
Wide margins are a reason for refiners to rejoice, but they are unlikely to last and will probably contract to some extent. According to our own oil and fuel market outlook (http://ffbk.orlen.com/blog/waiting-for-expectations-how-will-the-oil-supercycle-end), once excess oil supply has been absorbed by the market and the relevant adjustments have been made on the supply side, oil prices will enter a lasting uptrend, rising faster and sooner than fuel prices, which will again depress margins. In the next six to eight quarters, we are likely to witness oil prices rising and falling in search of a balance and fuel demand adjusting to lower prices.
In the last 12 months, 10%, or PLN 0.5 per litre, has been shaved off the prices of Eurosuper gasoline and diesel oil. The prices would be 25% lower if it were not for the stronger dollar. Strong refining margins, combined with the fact that European refining capacities are yet to be fully utilised (working at full capacity, Orlen refineries are an exception here), encourage refiners to increase throughput, while more fuels supplied to the market are a sign of potential future price cuts. In the next three to six months, provided the Greek crisis is contained, the US dollar should stop gaining in value and correct lower. Polish motorists hoping for lower pump prices may well see their expectations materialise.
Leclanché, the Swiss/German manufacturer of Li-ion cells and systems (earlier post), has acquired Belgian battery-maker Trineuron. The acquisition strengthens Leclanché’s product range and system integration capabilities, and will allow Leclanché to increase its offering in terms of Battery Management Systems (BMS) and module design for demanding transport applications, such as electric/hybrid buses, ferries, mining vehicles, AGVs (automated guided vehicles), and so on.
The acquisition of Trineuron also brings a team of 10 experienced engineers and an existing sales backlog and pipeline. Leclanché continues to position itself as one of the few companies manufacturing and offering multiple chemistries and multiple system designs.
Trineuron has a large number of existing customers in transport applications, in particular for AGVs, and has a significant sales pipeline.
The commercial activities will be integrated within Leclanché’s Transport Business Unit, while engineering will be integrated within the System Development Team. As part of the acquisition process, Stefan Louis, Managing Director of Trineuron, will join the executive committee of Leclanché in the role of Chief Strategy Officer. The acquisition is in the form of a share deal whereby the company acquires a subsidiary of Emrol against payment of 410,000 Leclanché shares.
The ICCT has published an informative new white paper assessing actions taken by state and local governments and public utilities to facilitate electric vehicle uptake in the 25 most populous cities in the United States. Across the cities studied, local actions helped plug-in electric vehicles reach 1.1% of new automobile sales in 2014, about 40% above the nationwide electric vehicle share. These locations represent 67% of new electric vehicle registrations and 53% of the public electric vehicle charging infrastructure in the U.S.