https://marketscale.com/industries/industrial-iot/putting-iot-to-work-for-u-s-sustainability-goals/
MarketScale Podcast with Daniel Litwin
From the inception of the Industrial Revolution, several core ingredients enabled the transformation and growth of industry: access to risk capital, visionary entrepreneurs, available labor, technology, resources, and energy. Technology and energy play a crucial role not only in growing industry but in enabling scale. Technology can open new markets and provide advantage through product differentiation and economies of scale. Energy is literally the fuel that scales operations.
Today technology, built from knowledge and data, is how companies compete, and energy is more integral than ever in scaling operations. Just as James Watt's improved steam engine of 1776 helped commence the Industrial Revolution, it was access to abundant coal, made possible by the steam-powered pump Thomas Savery invented in 1698, that gave industry its scale.
Most recently, Salesforce's (CRM) pending acquisition of Slack (WORK), following its acquisition of Tableau last year, serves as a reference point for how important technology is to sustaining market value. The market value of just seven companies accounts for 27% of the S&P 500's approximately $31.6 trillion. The industry and market impact of innovative technologies can be viewed through the lens of stock valuations, particularly as it applies to mergers and acquisitions. This article reviews these companies and technologies from the perspective of market sales opportunity and the economic impact of the technologies, based on the price/performance disruption they bring to the industry.
So why are we focusing on energy and data today? Energy, predominantly hydrocarbon fuels such as oil, natural gas, and even coal, is how people heat their homes and buildings, power transportation, and generate electricity to run lights, computers, machines, and equipment. In addition, there is substantial investment focus on the digital economy, Environmental, Social, and Governance (ESG) criteria, and innovative technologies. A common thread among these themes is energy and data.
Data and energy are the pillars of the digital economy. Energy efficiency can reduce carbon emissions, thereby improving ESG sustainability initiatives. Innovative technologies around energy and data are opening new markets and processes, from formulating new business models to structuring and operating businesses.
The climate imperative, along with investment in energy infrastructure and environmental ESG, is predicated on energy efficiency and relevant performance metrics to evaluate investment allocation decisions. Therefore, our initial emphasis begins with a background on energy consumption, focusing on electric consumption trends, carbon footprint, Green House Gas (GHG) emissions, sustainability, electric grid resilience, and technologies that impact energy, including Electric Vehicles (EV), energy storage, and Autonomous Driving (AD). Data technologies encompass cloud architecture, Software as a Service (SaaS), Machine Learning (ML) analytics, and the importance of data as digital transformation gives rise to the digital economy.
Digital Economy Performance Metrics
Before we dive into the financial and competitive analysis, let's review business models that are disruptive to the status quo; that is, innovative technologies capable of rapid scale and efficiency gains that change the economics of the market and business profitability. In addition, disruptive events, driven primarily by technology, often appear as waves as the adoption of innovative technologies expands through the market.
Prominent technological waves such as the personal computer (PC), followed by the internet and smartphones, and most recently social media and cloud computing, all engendered new business models and created new market opportunities that dramatically changed the status quo among the leading companies of the time. We will use the internet and mobile technology waves to explain how the introduction of innovative technologies offering vastly improved means of commerce enabled the development of new services that changed the business landscape.
Recent advances in technology appear as waves and give rise to new business models and markets. The internet is one example: it enabled communication over a new channel, allowing one-to-one and one-to-many interactions and the ability to engage, transact, and scale on a digital platform that dramatically lowered the cost of engagement. Scale is among the internet's most important attributes because the cost of digital replication is close to zero.
Mobile and smartphones began a new era in the digital world. The smartphone allowed a large portion of the world to interact with the internet for the first time on a mobile device. The mobile wave provided a platform that enabled the introduction of a host of new business models. The introduction of the Apple iPhone gave rise to several new services and industries, all accessible from a cell phone.
Let’s review the business model impact of innovative technologies as it applies to cost structure.
Cost Structure and Disruptive Innovation
As explained by ARK Investment Management's Catherine Wood, the rate of cost decline can serve as a proxy for evaluating the disruptive impact of an innovative technology. Cost structure improves as unit production expands. Theodore Wright, an aerospace engineer, first postulated that "for every accumulated doubling of aircraft production, costs fell by about 20 percent." Wright's Law, as it is now known (also called the Learning Curve or Experience Curve), is found across industries, each experiencing its own rate of declining costs.
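As a back-of-the-envelope sketch (not ARK's actual model), Wright's Law can be expressed in a few lines of Python. The 20% learning rate for aircraft comes from the text; the function name and example numbers are illustrative:

```python
import math

def wright_cost(initial_cost, initial_units, cumulative_units, learning_rate=0.20):
    """Unit cost under Wright's Law: each doubling of cumulative
    production cuts unit cost by learning_rate (20% for aircraft)."""
    doublings = math.log2(cumulative_units / initial_units)
    return initial_cost * (1 - learning_rate) ** doublings

# Example: cumulative production grows 8x (three doublings)
print(round(wright_cost(100.0, 1_000, 8_000), 2))  # 100 * 0.8^3 = 51.2
```

Industries with steeper learning rates simply use a larger `learning_rate`; the shape of the decline is what investors read as the magnitude of disruption.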
What is important from the perspective of investment firms such as ARK is that the magnitude of disruptive impact can be gleaned from these declining cost curves. Revenue growth can then be correlated with them: essentially, demand elasticity and future sales can be derived from the rate of product cost declines.
This is why price/performance and scientific metrics play an important role in evaluating products, services, and companies' competitive positions. For example, the average cellular price per gigabyte (GB) of data is approximately $12.37 in 2020, according to Small Business Trends. Another example, from science, is the performance of an LED light, assessed as light output in lumens relative to the energy consumed in watts (lumens/watt, Lm/W). These metrics are points in time; for more context, the changes over time and the magnitude of change provide insight into inflection points, trends, patterns, and relationships.
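For illustration, both kinds of measurement, the point-in-time metric and its change over time, reduce to simple arithmetic. The $/GB series below is hypothetical apart from the 2020 figure cited above:

```python
# Point-in-time metric: luminous efficacy in lumens per watt (Lm/W)
def efficacy(lumens, watts):
    return lumens / watts

# Change over time: year-over-year change in a metric series
def yoy_change(series):
    return [(b - a) / a for a, b in zip(series, series[1:])]

print(round(efficacy(800, 9), 1))            # a typical LED bulb: ~88.9 Lm/W
price_per_gb = [20.0, 16.0, 12.37]           # illustrative series ending at the 2020 figure
print([round(c, 2) for c in yoy_change(price_per_gb)])
```

The point metric ranks products at a moment in time; the rate of change is what reveals the inflection points and trends discussed above.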
As devices become complex, encompassing separate processors for communications, computing, power, video and various sensors, it is the integration and orchestration of the overall device performance that becomes of greater value to the user. So, price/performance, scientific understanding and economics become more attuned to relationships among these varied and interdependent components.
TAM Expansion Attribute
A data analytics framework is applicable to insight discovery; provides a roadmap towards innovation; and enables capabilities that can optimize approaches to new business models and opportunities. The following paper provides examples revealing how and why to apply visual analytics for discovery, innovation and evaluating new opportunities.
Discover how waveforms and patterns are applied to science and finance, and how customer usage patterns can reveal new approaches to market micro-segmentation and persona classification. Lastly, we'll reveal how the deployment of IoT devices across the enterprise fuels data flow from the physical world regarding the performance and condition of business assets.
Introduction
Our theme is applying visual data analytics as a tool for discovery, innovation, and evaluating market opportunities. We show how two metrics, price and volume, convey insight and establish price targets for technical analysis; why energy consumption patterns and waveforms lend themselves to understanding science and classifying human behavior; how proxy metrics can serve as measures for physical events; and why granular visibility into processes, and the monitoring of conditions and operating performance, help build an advantage in the digital economy.
Green Econometrics relies on visual analytics as a core fabric in our data analytics frameworks because visual analytics are integral to discovery, innovation, and new opportunity development. Visual insights are easy to understand – allowing business objectives and performance metrics to transfer seamlessly across business units. So how do we do it?
What it Means to Your Business
Key Air Quality Metrics
This post explores how the use of three key air quality metrics can improve the health and safety of your business. Occupant health and safety are paramount in the current environment, and sensors that detect harmful compounds can serve as a front line of defense. Given these uncertain times, efforts to reduce risks and improve environmental conditions will help to better support employees and build customer trust.
Begin the process by establishing a goal such as sustainability or worker productivity. From your goal or objective, identify metrics that are aligned with the goal, and then measure your progress toward it. Deploying this process improvement framework will improve your business in measurable ways. In this manner we transform metrics and data analytics into performance improvement aligned to desired outcomes, including sustainability and energy efficiency.
Our approach is to identify metrics aligned to your goals and objectives and to provide an analytics framework to assess performance. This involves data curation plus our proprietary data architecture and machine learning algorithms to provide context, perspective, and visual insight. The key is performance benchmarking for health, safety, sustainability, and energy efficiency. These are core environmental metrics and process capabilities that will transform your business model.
To zero in on important indoor air health metrics, cost-effective sensors are required. Based on health and energy efficiency objectives, we focus on these core indoor environmental metrics: carbon compounds (CO2 and methane), Volatile Organic Compounds (VOCs), and particulates. In our previous post, Green Econometrics discussed Air Changes per Hour (ACH) as a measure of air filtration performance: how many times does the air in a room change in an hour? Monitoring CO2 levels can serve as a proxy for determining acceptable ACH, and it is more cost effective to monitor the number of room air changes per hour through air filtration than to deploy expensive sensors to detect pathogens and viruses.
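As a rough sketch of how CO2 readings can proxy for ACH, the standard tracer-gas decay equation estimates air changes per hour from the fall in CO2 concentration in an unoccupied room. The 420 ppm outdoor baseline and the sample readings are assumptions for illustration, not figures from our post:

```python
import math

def ach_from_co2_decay(c_start, c_end, c_outdoor=420.0, hours=1.0):
    """Estimate air changes per hour from CO2 (ppm) decay in an
    unoccupied room, via the tracer-gas decay equation:
        ACH = ln((C_start - C_out) / (C_end - C_out)) / t
    """
    return math.log((c_start - c_outdoor) / (c_end - c_outdoor)) / hours

# Example: CO2 falls from 1200 ppm to 700 ppm in half an hour
print(round(ach_from_co2_decay(1200, 700, hours=0.5), 1))  # ~2.0 ACH
```

A low-cost CO2 sensor and a timestamped log are all this estimate requires, which is the cost argument made above.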
Analytics and Business Intelligence provide a framework for process improvement that drives operating efficiencies and enhances business value. Most business owners and managers want to increase business value to benefit shareholders, stakeholders, and investors. Individual investors and investment professionals direct capital towards companies that can demonstrate sustainable value. Changes to performance in revenues, margins, and risks can become a catalyst to invest or divest. Business value is often measured by three performance criteria – revenues, operating margins, and risks. Therefore, factors that contribute to revenue growth, margin expansion, and risk mitigation become the overarching goals to improve business value. We add that sustainable value includes resource conservation and efficiency.
Just how do analytics and business intelligence address revenues, costs, and risks in improving business value? To understand their integration in improving business value, let's look at two initiatives in formulating business strategy.
In his book Measure What Matters, John Doerr describes how establishing goals and objectives, along with corresponding performance criteria, better assures that key metrics are aligned to goals and business objectives. This process of mapping performance metrics to business objectives, defined as Objectives and Key Results (OKRs), determines what is relevant to measure and track. Complementing OKRs is the balanced scorecard approach, explained by Robert Kaplan and David Norton in their Harvard Business Review article "Using the Balanced Scorecard as a Strategic Management System," which pulls reporting data from each business unit and department to provide an assessment of conditions and performance.
The analytics framework for process improvement can translate into substantial benefits around sustainability and energy efficiency. The Coronavirus pandemic has upended social interaction, creating a new normal of social distancing and protocols. So why does sustainability play a crucial role in facilitating a smoother transition into this new normal? The reason is that sustainability engenders confidence. Knowing facilities are safe, and that indoor air quality is monitored for occupant health and safety, builds confidence. Health and safety are also essential in generating the confidence that changes consumer behavior. Therefore, the process by which you implement a sustainability plan plays an expanding role in orchestrating the activities that adhere to your values and performance.
A sustainability framework provides the roadmap to monitor, measure and curate data thus enabling performance benchmarking of conditions and processes. The analytics framework serves as a roadmap to utilize insight gained from data analysis. Currently available tools such as data visual analysis, machine learning algorithms and cloud computing architecture enable cost effective approaches to achieve business and sustainability objectives.
A sustainability framework provides the foundation to drive business value across several dimensions and performance metrics. The use of the sustainability process can drive business value, improve our environment, enhance customer loyalty, and better engage healthier and happier employees while rewarding shareholders and stakeholders with higher business valuations.
The latest data on oil consumption suggest the dip that appeared in 2008 after the global financial crisis quickly reversed. The contraction in oil has now turned to expansion, with consumption up 4% year over year globally.
According to the latest information reported by the Energy Information Administration (EIA), oil consumption is up 4% in 2010 from 2009. The oil consumption data suggest the global economy has recovered from the financial crisis, and the recovery has translated into higher oil demand.
Figure 1 Global Oil Demand
We have seen economic contractions result in declines in oil demand before. Oil demand dropped in the 1979-to-1983 period at roughly 10% per year. On a global basis, oil demand declined approximately 2% in 2009 from 2008, but is now up nearly 4% in 2010.
In the US, oil demand dropped 5.7% in 2008 and 3.7% in 2009, with demand in 2010 increasing 3.8%. The oil consumption trend in the US suggests the decline in oil demand was cyclical as opposed to any structural change in US consumer demand.
Figure 2 US China & India Oil Consumption
The real story is the growing demand for oil from China and India. According to data from The Centre for Global Energy Studies (CGES), demand for oil from China is up 100% in the last ten years. China's oil consumption has grown from 4.8 million barrels per day (MBPD) to 9.6 MBPD, amounting to half of total US consumption. In 2010, oil demand in China is up 17%.
The demand for oil in India is also increasing. Oil consumption in India is up 58% in the last ten years and up 8% in 2010.
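These ten-year totals imply modest compound annual growth rates, which a quick calculation makes explicit (the function and example are illustrative, derived from the figures above):

```python
def cagr(begin, end, years):
    """Compound annual growth rate implied by total growth over a period."""
    return (end / begin) ** (1 / years) - 1

# China: 4.8 -> 9.6 MBPD over ten years (a 100% increase)
print(round(cagr(4.8, 9.6, 10) * 100, 1))   # ~7.2% per year
# India: up 58% over the same ten years
print(round(cagr(1.0, 1.58, 10) * 100, 1))  # ~4.7% per year
```

Compounding at roughly 7% per year, China's demand doubles about every decade, which is the secular trend the rest of this post describes.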
Figure 3 China and India Oil Demand
The bottom line is that demand for oil continues to increase, and we expect further increases in oil prices.
The worst global economic recession since the Great Depression seems to be abating. Given the severity of the financial crisis, it might serve to review what impact the recession has had on oil consumption. In addition, what impact did the decline in oil consumption have on atmospheric CO2 concentration levels?
Since 2006, global oil consumption declined from 85.2 million barrels per day (BPD) in 2006 to 84.0 million BPD in 2009. Oil consumption in the US declined 9%, to 18.8 million from 20.7 million BPD in 2006. Europe experienced a decline of 7% over this same period, with a drop from 16.5 million to 15.2 million BPD. However, over this same period, oil consumption in China and India increased 16% and 13%, respectively. This data was compiled from the US Department of Energy's Energy Information Administration (EIA) and is displayed in the following charts.
To measure how significant the impact has been, the following charts provide some insights in evaluating how deteriorating world economies may have impacted oil consumption and secondly, whether reduced oil consumption has mitigated heightened CO2 levels.
Figure 1 Global Oil Consumption
Source: EIA
From Figure 1, the impact of the global financial crisis is depicted in the decline in global oil consumption. When oil consumption is compared between the US, China, and India, the relative drop is less discernible.
Figure 2 US, China, and India
Source: EIA
Figure 2 provides a summary of oil consumption of the US, China, and India. A measurable decline in oil consumption can be seen, but only in the US market.
Figure 3 China and India
Source: EIA
Figure 3 demonstrates the steady and pronounced growth in oil consumption for China and India. Despite the global financial crisis, oil consumption expanded significantly in China and India due to secular growth from rapid industrialization in both countries. Measured against the European market, China and India have grown from 15% of Europe's oil consumption rate in 1980 to over 74% in 2010.
Figure 4 CO2 Levels
Source: NOAA
With the decline in global oil consumption, perhaps a positive benefit would be a fall in CO2 levels. The atmospheric CO2 readings in parts per million (PPM) were taken from the National Oceanic and Atmospheric Administration (NOAA) monthly measurements at Mauna Loa. Figure 4 illustrates the average annual atmospheric CO2 concentration readings at Mauna Loa, Hawaii from 1980 through 2010.
The bottom line is even while global oil consumption declined during the recession, growth in China and India remained unabated and subsequently, CO2 concentrations in the atmosphere continue at elevated levels.
In memory of Jamie Kotula – loved by family, friends, teammates, and school.
With the oppressive heat and appalling humidity along the Eastern Seaboard, one considers the possibility of climate change and the impact that greenhouse gases may have on our environment. Without developing statistical regression models to glean any semblance of understanding of carbon dioxide's impact on climate change, let's just look at some charts that illustrate the changes in CO2 levels through history.
While industry experts and scientists debate whether elevated CO2 levels have an impact on climate change, the scientific data taken from ice core samples strongly suggest CO2 levels remained in a range of 180-to-299 parts per million (PPM) for the last four hundred thousand years. Scientists have developed models suggesting that rising CO2 levels contribute to global warming, which is subsequently followed by dramatic climate changes that lead to periods of rapid cooling: the ice ages.
Scientific theories suggest that rising global temperatures melt the polar ice, which allows substantial amounts of fresh water to enter the oceans. The fresh water disrupts the ocean currents that are responsible for establishing regional climates. As oceans warm near the equator, the warmer water travels toward each of the polar regions; the cooler water near the poles sinks and travels toward the equator. These ocean currents allow for stable climates. The issue is that fresh water is less dense because it is not salty like seawater. Therefore, the fresh water does not sink like the cold, saline seawater, thereby disrupting the normal flow of the ocean currents.
Figure 1, CO2 Ice Core Data, illustrates the level of CO2 over the last four hundred thousand years. The Vostok Ice Core CO2 data was compiled by Laboratoire de Glaciologie et de Geophysique de l'Environnement.
Ice Core Data
Figure 1 CO2 Levels – Vostok Ice Core CO2
Source: Laboratoire de Glaciologie et de Geophysique de l’Environnement
If this ice core CO2 data is correct, then the current data on atmospheric CO2 levels is quite profound. CO2 data is compiled by the National Oceanic and Atmospheric Administration (NOAA) at the Mauna Loa Observatory in Hawaii. The latest trend indicates CO2 levels for June 2010 are at a mean of 392 ppm, versus 339 in June 1980 and 317 in 1960. Clearly these CO2 levels are elevated. The question is: what is the impact on our environment?
Aside from the catastrophe in the Gulf of Mexico and the dire need to find an alternative to our dependence on oil, should we not accelerate our efforts to find alternative energy solutions, as a way to mitigate the impact of CO2 on our environment? Investment in alternative energy could help solve multiple problems.
Figure 2 Mauna Loa CO2 Readings
Source: Source data published by the National Oceanic and Atmospheric Administration (NOAA)
The bottom line is that we need to consider the possibility that elevated CO2 levels in our atmosphere could potentially have a detrimental impact on our climate. In any event, limiting our dependence on fossil fuels, the main contributor to CO2, should be paramount. Let us not forget oil is supply-constrained – there are no readily available substitutes aside from electric vehicles, and without a strategy to embrace renewable energy, supply disruptions will have a painful impact on our economy, national security, and environment.
Although silicon is the industry-standard semiconductor in most electronic products, including the solar cells that photovoltaic panels use to convert sunlight into electricity, it is not the most efficient material available. For instance, the semiconductor gallium arsenide and related compound semiconductors offer nearly twice the performance of silicon in solar devices, yet they are rarely used in utility-scale applications because of their high production cost.
University of Illinois professors J. Rogers and X. Li discovered lower-cost ways to produce thin films of gallium arsenide that also expanded the variety of devices into which the films could be incorporated.
If the cost of gallium arsenide and other compound semiconductors can be reduced substantially, their range of applications could expand.
Typically, gallium arsenide is deposited in a single thin layer on a small wafer. Either the desired device is fabricated directly on the wafer, or the semiconductor-coated wafer is cut into chips of the preferred dimensions. The Illinois group instead deposited multiple layers of the material on a single wafer, creating a layered, "pancake" stack of gallium arsenide thin films.
Figure 1 Thin Film Solar
Source: University of Illinois
If you grow ten layers in one run, you only have to load the wafer once, saving substantially on production costs. Conventional processes may require ten separate growths, and the repeated loading and unloading, with temperature ramp-up and ramp-down, adds time and cost. Taking into account what each growth requires (the machine, the procedure, the time, the people), the overhead savings from the new multilayer approach amount to a substantial cost reduction.
Next, the scientists peel off the layers individually and transfer them. To accomplish this, the stacks alternate layers of aluminum arsenide with the gallium arsenide. Bathing the stacks in a solution of acid and an oxidizing agent dissolves the layers of aluminum arsenide, freeing the individual thin sheets of gallium arsenide. A soft, stamp-like device picks up the layers one at a time from the top down, for transfer to another substrate: glass, plastic, or silicon, depending on the application. The wafer can then be reused for another growth.
In this way it is possible to create considerably more material, far more rapidly and cost effectively. The process can produce mass quantities of material, compared with the thin single layer in which it is usually grown.
Freeing the material from the wafer also opens the possibility of flexible, thin-film electronics made with gallium arsenide or other high-speed semiconductors: devices that can conform while still retaining high performance, which is significant.
In a paper published online May 20 in the journal Nature, the group describes its procedures and demonstrates three types of devices made with gallium arsenide chips grown in multilayer stacks: light-emitting devices, high-speed transistors, and solar cells. The authors also provide a detailed cost comparison.
Another benefit of the multilayer method is release from area constraints, which is especially important for photovoltaic cells. As the layers are removed from the stack, they can be laid out side by side on another substrate to create a much greater surface area, whereas the typical single-layer process confines the area to the size of the wafer.
Figure 2 Solar Arsenium
Source: University of Illinois
For solar panels, you want large area coverage to capture as much sunlight as possible. In an extreme case, enough layers could be grown to yield ten times the area of a traditional single-layer growth.
After that, the team plans to explore more potential device applications and additional semiconductor materials that might adapt to multilayer growth.
About the Source – Shannon Combs publishes articles for the residential solar power savings blog, her personal hobby blog focused on tips to help homeowners save energy with solar power.
A combined heat and power (CHP) system is the cogeneration, or simultaneous generation, of multiple forms of energy in an integrated system. CHP systems consume less fuel than separate heat and power generating systems. According to the Environmental Protection Agency (EPA) in its Combined Heat and Power Partnership report, CHP systems typically consume only three-quarters the energy that separate heat and power systems require. By combining heat and power in the same energy system, the total system achieves efficiency gains. Heuristically, higher temperature and pressure ratios result in higher-efficiency systems. In addition, the thermal energy produced by the CHP system can be used to drive motor applications or to produce heat, steam, and hot water.
As an initial step toward reducing greenhouse gas (GHG) emissions, natural gas turbines can achieve overall efficiency of 65-80%. In addition, CHP offers lower GHG emissions in comparison to conventional standalone systems. Gas turbine CHP systems operate under a thermodynamic principle called the Brayton cycle. The design characteristics of a CHP gas turbine provide: 1) high electric and total system efficiency; 2) high-temperature, high-quality thermal output for heating or for heat-recovery steam power generation; 3) flexible fuel options such as propane, natural gas, and landfill gas; 4) high reliability, with 3-to-5 years between overhauls running 24/7; and 5) significantly lower GHG emissions.
Figure 1 Gas Turbine CHP System
Figure 1 illustrates the mechanics and variables of a CHP system. In summary, CHP technology enables the efficient supply of heat and power while minimizing GHG emissions. Total CHP efficiency is defined as the sum of net power produced plus the thermal output used for heating, divided by total fuel input.
The use of methane (natural gas) as the main fuel for a CHP system offers advantages because methane has the highest hydrogen-to-carbon ratio among fossil fuels and therefore combusts with the lowest GHG emissions. According to EPA data, NOx emissions from gas turbines range between 0.17 and 0.25 lbs/MWH with no post-combustion emissions control, versus 1.0-to-4.2 lbs/MWH for coal-fed boilers. The carbon content of natural gas is 34 lbs of carbon per MMBtu, in comparison to coal at 66 lbs of carbon per MMBtu.
There are two valuable metrics used to measure efficiency for CHP systems. One is total system efficiency, which measures the overall efficiency of the CHP system including both heat and electricity; the other is effective electric efficiency, which is useful for comparing CHP electric production against grid-supplied power. A guideline for measuring these two efficiency metrics can be found at EPA – Efficiency Metrics for CHP Systems.
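A minimal sketch of the two metrics follows, using illustrative numbers rather than figures from the EPA guideline; the 80% boiler efficiency assumed in the effective-electric calculation is a common benchmark for the displaced conventional boiler, and the function names are ours:

```python
def total_system_efficiency(power_out, thermal_out, fuel_in):
    """(Net power + useful thermal output) / total fuel input."""
    return (power_out + thermal_out) / fuel_in

def effective_electric_efficiency(power_out, thermal_out, fuel_in, boiler_eff=0.80):
    """Net power divided by the fuel consumed beyond what a conventional
    boiler (assumed 80% efficient) would need to supply the same heat."""
    return power_out / (fuel_in - thermal_out / boiler_eff)

# Illustrative gas-turbine CHP: 34 units of power and 40 units of useful
# heat per 100 units of fuel input
print(round(total_system_efficiency(34, 40, 100), 2))       # 0.74
print(round(effective_electric_efficiency(34, 40, 100), 2)) # 0.68
```

The gap between the two numbers shows why the metrics answer different questions: total system efficiency rates the whole plant, while effective electric efficiency credits the CHP unit for the boiler fuel it displaces.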
Figure 2 CHP Efficiency
The economics of a CHP system depend on effective use of the thermal energy in the exhaust gases. Exhaust gases are primarily applied to heating the facility and can also feed heat recovery steam generators (HRSG) to produce additional electric power. The total efficiency of the CHP system is directly proportional to the amount of energy recovered from the thermal exhaust. Another important concept related to CHP efficiency is the power-to-heat ratio, which indicates the proportion of power (electrical or mechanical energy) to heat energy (steam or hot water) produced in the CHP system. The following provides an overview of the economics of a CHP system.
Figure 3 CHP Economics
Figure 3 illustrates the economics of a CHP system in comparison to competing energy sources. While CHP does not match the low cost of coal in producing electricity, the economic value of reducing GHG emissions is quite significant, though beyond the scope of this article. However, natural gas prices remain below those of oil, and better ways of capturing exhaust heat will further improve CHP efficiency. The bottom line is that natural gas produces fewer GHG emissions than coal or oil; therefore, businesses should consider the benefits of CHP as a source of heat and power.
Last month First Solar (FSLR) achieved a milestone in the solar industry, announcing it had reduced its production cost for solar modules to 98 cents per watt, thereby breaking the $1-per-watt price barrier. While the achievement is great news for the solar industry, some studies suggest more work is needed. An article in Popular Mechanics, $1 per Watt, cites university studies questioning the scalability of solar given the immense global need for energy. Last year our post Solar Energy Limits – Possible Constraints in Tellurium Production? discussed possible limits that tellurium production places on thin-film solar photovoltaic (PV) suppliers.
In addition, Barron's published an article (March 30, 2009), Nightfall Comes to Solar Land, providing unique insight into the economics of solar PV suppliers. High oil prices and soaring stock prices for solar PV companies led silicon suppliers to ramp production capacity; according to the Barron's article, this has now transitioned into an oversupply of the polysilicon used in PV panels, eroding the cost advantage that thin-film PV companies such as First Solar and Energy Conversion Devices (ENER) had established over polysilicon PV firms such as SunPower (SPWRA).
However, PV panels typically represent approximately half the cost of a solar energy system. The following figure, Solar Installation Costs, compares the total cost of installing a solar energy system, which includes labor and supporting materials.
Figure 1 Solar Installation Costs
As illustrated in Figure 1, the panels represent a significant share of installation cost, but labor and the support brackets for the PV panels are significant as well. While thin film PV enjoys significantly lower panel costs and is easier to install, its supporting brackets are sometimes more expensive. As prices for silicon fall, the cost disparity between thin film and silicon PV will narrow.
Figure 2 Solar Energy Economics
In Figure 2, Green Econometrics compares PV efficiency, measured in watts per square meter, against cost per watt. The selected companies represent a small portion of the global PV suppliers, but they do illustrate the position of the leading US suppliers. The ideal model is to lower cost per watt while improving PV efficiency. But be cognizant that PV module cost per watt may not be indicative of total system costs.
A comparison of wind and solar energy costs demonstrated by Detronics offers a useful framework for comparing wind and solar costs by kilowatt-hour (KWH). As a caveat, wind and solar resources vary dramatically by location. In the Detronics example, the costs per KWH represent the production over one year, and both wind and solar systems have 20-year life spans. Over twenty years, the 1,000-watt wind system’s first-year cost of $7.35 per KWH would average approximately $0.36 per KWH, and the 750-watt solar system’s first-year cost of $10.68 per KWH would amount to about $0.53 per KWH over the investment period.
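The averaging step in the Detronics comparison can be made explicit: the quoted figures are costs per KWH based on a single year's production, so spreading them over a 20-year system life simply divides by 20. A minimal sketch, using the dollar figures from the text:

```python
LIFESPAN_YEARS = 20  # both systems assumed to last 20 years

def lifetime_cost_per_kwh(first_year_cost_per_kwh, years=LIFESPAN_YEARS):
    """Average cost per KWH when the first-year cost is spread over the system life."""
    return first_year_cost_per_kwh / years

wind = lifetime_cost_per_kwh(7.35)    # 1,000-watt wind system
solar = lifetime_cost_per_kwh(10.68)  # 750-watt solar system
print(f"wind:  ${wind:.2f} per KWH")
print(f"solar: ${solar:.2f} per KWH")
```

This yields roughly $0.37 per KWH for wind and $0.53 for solar, matching the rounded figures quoted above.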
Figure 3 Alternative Energy Pricing
The Alternative Energy Pricing chart was based on research from Solarbuzz, one of the leading research firms in solar energy. The cost per KWH that Solarbuzz provides is a global average. Even with cost per watt falling below $1.00, installed system costs are closer to $5.00 per watt according to Abound Solar (formerly known as AVA Solar), which is still short of grid parity at a cost of $0.21 per KWH.
The bottom line is that despite lower PV panel costs, we are still not at parity with hydrocarbon fuels such as coal and oil. Carbon taxes, alternative energy stimulus, and more investment in alternative energy are required to improve the economics of solar and wind.
As President Obama takes office, energy efficiency takes center stage. One of the fastest roads to energy efficiency is to reduce consumption, and the simplest approach to energy conservation is to change a light bulb.
Compact fluorescent light bulbs (CFLs), recommended by the U.S. Department of Energy (DOE), offer substantial savings to homeowners. In the commercial market, lighting fixtures consume the greatest amount of electric energy: three times the energy consumption of air conditioning. According to a research report from the Energy Information Administration (EIA), the Commercial Buildings Energy Consumption Survey, lighting consumes the largest amount of electricity in commercial buildings as measured by kilowatt-hours (KWH) per square foot.
To calculate KWH, multiply the wattage of your lighting fixtures by the yearly hours of operation for your facility and divide by 1,000. KWH per square foot provides a useful means of measuring the energy intensity of a building: just divide KWH by the total square footage of the building.
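The two formulas above are simple enough to sketch directly. The fixture count, wattage, operating hours, and floor area below are hypothetical, chosen only to illustrate the calculation:

```python
def annual_kwh(total_fixture_watts, hours_per_year):
    """KWH = wattage x yearly hours of operation / 1,000."""
    return total_fixture_watts * hours_per_year / 1000

def energy_intensity(total_kwh, square_feet):
    """Energy intensity of a building, in KWH per square foot."""
    return total_kwh / square_feet

# Hypothetical facility: 100 fixtures at 150 W each, 3,000 hours/year, 10,000 sq ft
kwh = annual_kwh(100 * 150, 3000)
print(kwh)                            # 45,000 KWH per year
print(energy_intensity(kwh, 10_000))  # 4.5 KWH per sq ft
```

The intensity figure is what an energy audit would compare against the EIA end-use benchmarks discussed below.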
An energy audit determines the energy intensity of your building as measured by KWH/sq ft. Figure 1 illustrates energy intensity by end use according to the EIA’s 2008 report, Electricity Consumption (kWh) Intensities by End Use.
Figure 1 Lighting Consumes Most Energy
Furthermore, as part of the same research from the EIA, most commercial buildings are not using energy efficient lighting. The study finds that most commercial buildings, even those built after 1980, still rely on legacy incandescent and standard fluorescent light fixtures.
Figure 2 Most Commercial Buildings Lack Energy Efficient Lighting
After your energy audit is complete and you know your energy intensity, the next step is to understand the efficiency of lighting systems. Lighting efficiency is measured in lumens per watt and is calculated by dividing the lumen output of the light by the watts consumed. A lumen is one foot-candle falling on one square foot of area.
While lumen output is important in measuring brightness, color temperature, measured in degrees Kelvin, indicates the hue of the light and is also important in evaluating lighting systems, because systems operating near 5,500 degrees Kelvin simulate sunlight at noon. Energy efficient lighting fixtures provide twice the lumens per watt of legacy metal halide fixtures while offering a higher color temperature, enabling near-daylight rendering.
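The lumens-per-watt comparison is straightforward to compute. The fixture figures below are hypothetical, picked only to illustrate the roughly two-to-one efficiency gap described above:

```python
def lumens_per_watt(lumens, watts):
    """Lighting efficiency: lumen output divided by watts consumed."""
    return lumens / watts

# Hypothetical fixtures (illustrative numbers, not manufacturer specs)
legacy_metal_halide = lumens_per_watt(36_000, 458)    # 400 W lamp plus ballast draw
efficient_fixture = lumens_per_watt(30_000, 234)      # modern fluorescent fixture
print(round(legacy_metal_halide))  # about 79 lumens per watt
print(round(efficient_fixture))    # about 128 lumens per watt
```

Comparing fixtures on delivered lumens per watt, rather than nominal lamp wattage, captures ballast losses as well.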
Figure 3 Energy Efficient Lighting
The bottom line is that small steps sometimes produce big results. Retrofitting your building with energy efficient lighting systems saves energy, reduces operating expenses, and improves employee productivity and safety, while helping the environment. A 1.3 KWH reduction in power consumption reduces carbon dioxide (CO2) emissions by 1 pound. Coal generates about half the electric power in the U.S. and produces roughly ¾ of a pound of CO2 for every KWH of electricity. In addition, reducing energy consumption in buildings makes alternative energy sources such as solar and wind more viable.
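The rule of thumb above (1.3 KWH saved per pound of CO2 avoided) makes it easy to estimate the emissions impact of a retrofit. The 45,000 KWH annual savings below is a hypothetical figure for illustration:

```python
LBS_CO2_PER_KWH = 1 / 1.3  # from the rule of thumb: 1.3 KWH saved avoids 1 lb CO2

def co2_savings_lbs(kwh_saved):
    """Pounds of CO2 avoided for a given reduction in electricity use."""
    return kwh_saved * LBS_CO2_PER_KWH

# Hypothetical lighting retrofit saving 45,000 KWH per year
print(round(co2_savings_lbs(45_000)))  # about 34,615 lbs of CO2 avoided annually
```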