
Weather Risk and Big Data

By: admin | June 01, 2018 | weather-report | 0 Comments

Big news: Weather Analytics (soon to be Athenium Analytics, following our acquisition of and merger with Athenium!) has joined the Weather Risk Management Association (WRMA) and will attend the 20th WRMA conference, June 6th-8th, in Miami, FL. Weather Analytics is advancing the services and sectors covered under the weather risk market through Big Data.

Most people are familiar with traditional catastrophic weather risk: wind toppling trees onto houses, hail striking cars and rivers flooding homes all demonstrate how dangerous Mother Nature can be. Although these events are infrequent, they can be devastating to those affected. That is why insurance policies are purchased to protect property against these low-probability, high-impact losses, transferring the catastrophic weather risk.

Insurers need Big Data, as a greater understanding of the risks leads to more efficient underwriting and claims processes. Weather Analytics' high-resolution data and analysis have expertly aided assessment of this class of risk for years. Our clients access the product suite through user-friendly applications. Gauge provides risk scoring for underwriting support based on our historical weather data. Beacon allows carriers to alert their insureds to incoming hazardous weather for loss mitigation and claims preparation. Dexter, a post-event forensics tool, recently gave a broader view of the hail that occurred in the Northeast on May 15th, 2018. Notice in the image below how the resolution (0.6 × 0.6 mile grids) provides insurers with far more information than the traditional observations shown in purple.


Non-Catastrophic Weather Risk

So what about non-catastrophic weather risk? Think of a concert being canceled due to rain. The stadium loses money when they don’t get to charge $10 for hot dogs. This is a risk to the stadium’s anticipated yearly revenue. Other examples include a lack of snowfall deterring skiers from buying lift tickets, warm winters lowering natural gas demand and clouds decreasing solar power production. These events tend to happen often, but without catastrophic financial effect (high probability, low impact).

Investment vehicles like weather derivatives may be used to hedge these non-cat risks. Weather derivatives are financial contracts based on an underlying meteorological variable. Standardized contracts are traded through exchanges like the Chicago Mercantile Exchange, while more customizable contracts are found in the over-the-counter (OTC) market. Buyers usually pay a premium and receive a payout if specified weather conditions occur in a given area and time period. Here's an example:

A frozen lemonade company recognizes that when summer temperatures in the Northeast are cooler than normal, its revenue in that region drops. The company purchases a weather derivative that pays out if cooler temperatures occur, replacing the revenue lost when fewer people need a delicious frozen drink. If the summer turns out to be a scorcher, revenues will likely be through the roof and the company won't mind the lost premium.
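For the technically curious, here's a minimal sketch of how such a contract might settle against a cooling degree day (CDD) index. The 65 °F base is the standard degree-day convention; the strike, tick, and cap figures are purely illustrative assumptions, not terms of any real contract:

```python
BASE_TEMP_F = 65.0  # standard base temperature for degree-day indices

def cooling_degree_days(daily_mean_temps_f):
    """Season total of daily-mean degrees above the 65 F base."""
    return sum(max(0.0, t - BASE_TEMP_F) for t in daily_mean_temps_f)

def cdd_put_payout(cdd, strike=800.0, tick=2000.0, cap=1_000_000.0):
    """Pay `tick` dollars per CDD the index falls below the strike,
    capped at `cap`. A cool summer (low CDD) pays; a scorcher pays zero."""
    return min(cap, tick * max(0.0, strike - cdd))

# A cool Northeast summer: 91 days of mostly mild mean temperatures
cool_summer = [68, 72, 70, 66, 74, 71, 69] * 13
cdd = cooling_degree_days(cool_summer)   # 455 CDD for the season
payout = cdd_put_payout(cdd)             # 2000 * (800 - 455) = 690,000
```

A hot summer would push the index above the strike, the option would expire worthless, and the company would simply forfeit the premium while lemonade sales boomed.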


Big Data to the Rescue!

Reinsurers and other weather derivatives dealers rely on historical data to accurately price contracts. Increases in the granularity of the weather data allow OTC contracts to more efficiently hedge the risk of the individual. That's where Weather Analytics comes in: delivering global climate intelligence by providing statistically stable, gap-free data formed from an extensive collection of historical, current and forecasted weather content, coupled with proprietary analytics and methodologies. The hourly ground station and 30 km gridded data cover 39 years of hyper-local weather variables, allowing dealers to address their clients' specific global weather risk.

For more information on how Weather Analytics assists top weather risk market participants, e-mail me here:

Matt Davey
Lead for Weather Derivatives

Read More
Hurricane Forecasting

A Look Back on Forecasting Hurricane Harvey

By: Emmett Soldati | September 05, 2017 | weather-report | 0 Comments

Hurricane Harvey made its mark on history as one of the most devastating hurricanes to hit a coastal region of the United States. It stands as a powerful reminder to insurers about the value of early – and accurate – warnings of landfall. While hurricane season remains open, a fresh take on hurricane forecasting emerges to help us assess the past and plan better for the future.


Throughout the life-cycle of Hurricane Harvey, Weather Analytics published its 10-day ensemble hurricane forecast model, Beacon, to anticipate storm movements. This forecast, built with observational and forecast data for over 200 tropical cyclones combined with machine-learning algorithms, exhibited impressive accuracy in forecasting the track of Hurricane Harvey prior to making landfall.

This figure shows the overall recorded track of Hurricane Harvey as of August 30th.

This figure from Weather Analytics’ Beacon Hurricane platform shows a forecast of Harvey two days prior to making landfall.


The tool, Beacon Hurricane, is the country’s leading hurricane forecast built on an artificial intelligence platform that pulls together all major global hurricane forecasts – including the Canadian Model, the European Centre, and the U.S.’ National Hurricane Center model.  Originally developed for a U.S. Government agency, the model uses pattern recognition to bias-correct forecasts based on the historic performance of a given forecast from the last 30 years.
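While Beacon's actual algorithms are far more sophisticated, the core idea of weighting forecasts by historic performance can be sketched very simply: give each model a weight inversely proportional to its past track error. The model names are real forecasting systems, but the forecast positions and error figures below are illustrative assumptions:

```python
def blend_positions(forecasts, historical_errors_km):
    """Inverse-error-weighted average of (lat, lon) model forecasts:
    models with smaller historical track errors get larger weights."""
    weights = {m: 1.0 / historical_errors_km[m] for m in forecasts}
    total = sum(weights.values())
    lat = sum(weights[m] * forecasts[m][0] for m in forecasts) / total
    lon = sum(weights[m] * forecasts[m][1] for m in forecasts) / total
    return lat, lon

# Hypothetical 48-hour position forecasts for one storm (lat, lon)
forecasts = {"ECMWF": (28.0, -96.5), "GFS": (28.6, -95.8), "CMC": (27.7, -97.0)}
# Hypothetical mean 48-hour track errors (km) from past seasons
errors = {"ECMWF": 80.0, "GFS": 110.0, "CMC": 130.0}
blended = blend_positions(forecasts, errors)  # pulled toward the ECMWF track
```

The pattern-recognition step in the real system goes further, correcting each model's systematic biases before blending, but the weighting intuition is the same.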

“When such an Agency asks for a better hurricane forecast – an agency tasked with managing, among other things, some of our Nation’s most critical infrastructure – it’s all hands on deck – our meteorologists, data scientists and engineers.” says Bill Pardue, Chairman and CEO of Weather Analytics. “The novelty here was combining a team of skilled meteorologists who had served in the deepest levels of global forecasting outfits, with a team of machine-learning data-scientists who are capable of building highly astute pattern-recognition software.  What we didn’t know until the development started is that even something as dynamic as hurricane forecasts are subject to patterns and biases that we can control for.”

This figure shows the forecasted ‘cone of uncertainty’ of Harvey by The National Hurricane Center when Harvey was 36 hours away from making landfall.

This figure shows the forecasted tracks of Harvey by the Weather Analytics Beacon Hurricane forecast model when Harvey was 36 hours away from making landfall.

As senior atmospheric scientist at Weather Analytics, Dr. Stefan Cecelski puts it, “this approach brings together the best of both worlds: state-of-the-industry hurricane ensemble forecasts with the latest in machine-learning data science techniques.”

As insurers look to Irma, the tropical storm currently forming in the Atlantic, this new predictive analytics technology will provide a timelier opportunity to understand, with confidence, the direction the impending hurricane will take. This will help customers better forecast downstream effects and potential damage before it occurs.

Weather Analytics urges you to contribute in whatever way you can to the relief effort in the aftermath of Hurricane Harvey.

1. National Hurricane Center Harvey Graphics Archive.

Read More

Computer Vision Comes of Age

By: Emmett Soldati | February 13, 2017 | weather-report | 0 Comments

Computer vision is all the rage in tech. Relying on algorithms to review and make judgment calls on collections of images opens up a host of new applications in the consumer and business-to-business markets. By combining imagery analytics with weather and atmospheric analytics, computer vision will soon be a reality at forward-leaning insurance companies.

Property and casualty insurance firms have an increased need and appetite for this kind of innovation. While they’ve built massive databases, full of extensive property-feature information, much of this data collection has happened over extended periods of time – through a sea change of technological advances, new demographics, and portfolio acquisitions.  All of this change stands against an evolving risk landscape of natural catastrophes, convective storms, wildfires, and other environmental perils.  The ‘known-knowns’ of typical property portfolio exposures are shifting.  But upgrading these books of business is no small task.

Future-Proofing Your Portfolio

Changes to the risk landscape – especially weather and environmental threats – present new underwriting challenges that require new data collection.  Was the underwriter aware that the region was prone to wildfire spread?  Did the coastal property business anticipate private flood insurance when they sent surveyors over a decade ago?  Understanding the present and future risks puts underwriters in the right place asking the right questions today, to future-proof their data.  However, even when underwriters are asking the right questions, and collecting the right information, carriers do not always have clean, accurate, accessible, and machine-readable data.

Enter imagery analytics – the deus ex machina for this actuarial coming-of-technological-age tale.  The predecessor for this field of analytics was Keyhole Inc., an Earth imagery company with strong ties to the U.S. Intelligence Community.  Google purchased Keyhole in 2004, renamed it “Google Earth” and began adding significant new functionality and content.  But it would be almost a decade before drivers in insurance, financial markets, energy, and agriculture would transform the consumer novelty of Google Earth’s imagery insights into commercial table-stakes.

The key was training computers to recognize objects in images and automatically detect changes in those objects, through machine learning techniques.  Bringing this pattern-recognition protocol to the visual domain relies on a burgeoning data science field known as neural networks.  Many people see the impacts of neural networks through facial recognition software, but don’t really understand the science behind the powerful ability for computers to detect seemingly complex images or faces.

The premise is fairly simple – train a model to read pixel layers, and the proximity of certain pixels to other pixels, to recognize patterns and categorize similar features. Then store these features in computer memory and search and tag similar features in new sets of images. This capability allows Snapchat to recognize where your nose is, to swap it with a cartoon dog’s nose, or helps your search engine find thousands of cat images from a Google image search. When applied to insured houses and businesses, it can transform a sea of American homes and roofs into underwriting intelligence.
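Here is a toy illustration of that premise. Production systems learn their filters from data with convolutional neural networks; this hand-written cross-correlation just shows the underlying mechanic of sliding a small pixel template over an image and scoring each window:

```python
def match_scores(image, template):
    """Slide a 2-D template over a 2-D image (both lists of lists) and
    return the raw cross-correlation score at every position."""
    th, tw = len(template), len(template[0])
    ih, iw = len(image), len(image[0])
    scores = []
    for r in range(ih - th + 1):
        row = []
        for c in range(iw - tw + 1):
            row.append(sum(image[r + i][c + j] * template[i][j]
                           for i in range(th) for j in range(tw)))
        scores.append(row)
    return scores

# A 4x4 binary "image" with a 2x2 bright blob, and a 2x2 blob template
image = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
template = [[1, 1],
            [1, 1]]
scores = match_scores(image, template)
best = max(max(row) for row in scores)  # peaks where the blob sits
```

A neural network layers thousands of such learned filters, which is what lets it distinguish a gabled roof from a flat one rather than just a bright blob from a dark background.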

Unlike faces, or cats, the availability of useable roofing and property imagery is surprisingly limited.  The sky-race to get the right equipment in the air plays out between cube-sat, drone, and aerial fly-over companies, while some other organizations opt to analyze public and open source data already collected. The test of these technologies comes down to speed of deployment, geographic coverage, and image resolution.  The three leading imagery-capture technologies – satellites, drones, and airplanes – each has its strengths and weaknesses.

As Google’s TerraBella (originally SkyBox) and other low-Earth orbiting satellite companies push for optically advanced satellites demonstrate, image resolution matters.  Most of these companies boast a pixel resolution of 30-90cm.  Being able to derive meaningful (and confident) property attributes from this level of resolution is nearly impossible.  Trying to detect roof damage on a single family home from Landsat or other satellite images would be akin to identifying a smile on the face of a blurred-out ‘Perp’ on the hit 90s show COPS. Overtime, the push for these low-earth-orbiting satellites to improve their optical vision as they travel closer to the surface will be the game-changer.

Drones offer the opposite value.  Quick and close fly-overs allow for stunningly high-resolution images of rooftops (2-5cm).  But the hardware is limited in its ability to cover ground.  Most drone imagery companies have focused their deployments in a handful of dense population centers.  Though perhaps beneficial for activities like catastrophic response and claims handling, this does little to assist with the more massive portfolio upgrades of major property carriers.

Aerial fly overs, so far, represent the goldilocks approach of the three.  Planes can get close enough to collect usable and machine-visible imagery, while the speed and frequency of trips allows for much broader coverage.

Equipped with high-resolution and widespread imagery, neural-network analytics can transform these images into actionable data at the property level.  This transformation into "roof type" or "vegetation encroachment" takes some time to train the models – but once the out-of-sample tests deliver a high confidence value, you've got yourself a new portfolio database.

This data, however, is not an end in itself. Property features, absent risk, are glorified tax maps.  Fusing property features with atmospheric perils, analyzed over time, compared against real-life claims and loss information, is a recipe for the underwriter’s true dream: insight.

As our name suggests, Weather Analytics is, at its core, a data analytics company.  We deploy data science expertise to understand and predict new risks from a climatological and atmospheric perspective – from risk scoring for under-modeled perils like hail and winter storms to building and broadcasting the world's most advanced machine-learning hurricane forecast.   Weather has always been our main subject.  As our company has expanded over the last several years, however, understanding the object of interest – the domain that the weather risks apply to, be they properties, infrastructure, crops, or even energy grids – has pushed the company into less charted territory. Deriving the property characteristics for our customers is the first hurdle, but as environmental risk experts, the important question to follow is, "so what?"

Ice Dam Case Study

For some primary insurance carriers, the “so what?” is asking how they can transform their property portfolio into a hierarchy of exposure. Based on years of rigorous underwriting, one New England insurance client of Weather Analytics already has substantial machine-readable property data on hand – from year built to architectural style to heating unit.  But how to make sense of it all? In a recent work program, Weather Analytics transformed this grab bag of property features into acute underwriting logic around one of New England’s most maddening winter phenomena – ice dams.

Analysis of over 1,000 policies and 500 claims related to winter roof damage from the 2015 winter season revealed distinct correlations between property features and the risk of ice dams.  Running a feature-selection model similar to Weather Analytics' crop yield forecast, we narrowed the key performance indicators down to three main factors contributing to ice dams: roof type (mansard, gabled, flat, etc.), followed by house heating system and roof material.  Using high-resolution surface and atmospheric weather data, we then built an ice dam model based on the specific weather events that contribute to roof damage.  Ice dams are complicated beasts that require several atmospheric conditions to fall into place over time.  Weather Analytics' multi-variable analysis can determine how much snow pack, followed by how much warm (melting) period and subsequent freeze period, contributed to a claim event.
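Conceptually, that snow-then-melt-then-refreeze sequence can be sketched as a simple state machine over a daily weather series. This is only a schematic illustration; the thresholds are illustrative assumptions, not the calibrated values in Weather Analytics' model:

```python
def ice_dam_conditions(days, min_snow_in=6.0, melt_temp_f=35.0,
                       freeze_temp_f=28.0):
    """days: daily (snow_depth_inches, mean_temp_f) tuples, in order.
    Returns True once a deep snow pack is followed by a melting spell
    and then a refreeze -- the classic ice-dam setup."""
    stage = "need_snow"
    for snow, temp in days:
        if stage == "need_snow" and snow >= min_snow_in:
            stage = "need_melt"
        elif stage == "need_melt" and snow >= min_snow_in and temp >= melt_temp_f:
            stage = "need_freeze"
        elif stage == "need_freeze" and temp <= freeze_temp_f:
            return True
    return False

# A classic New England setup: big snowfall, a thaw, then a hard freeze
series = [(2, 20), (8, 18), (9, 22), (9, 38), (8, 40), (7, 15)]
at_risk = ice_dam_conditions(series)  # True
```

The production model replaces these binary thresholds with continuous measures of snow pack, melt duration, and freeze depth, and weighs them against the roof type and heating system of each property.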

Furthermore, this prediction model can determine how the severity of an ice damming event might affect the spectrum of roof types and materials most vulnerable to damage.  Using our forecast alerting app, Beacon, our customers will know what properties to focus on when the next storm hits.  And as they look to expand their business, they’ll be able to rely on Weather Analytics imagery analysis to collect and verify property features congruent with the rest of their portfolio.

The implications of computer-vision don’t stop with claims analysis.  Our fusion of weather and property data supports underwriting (e.g. property-level risk scoring, broken down by regional ice dam ‘zones’), risk mitigation and customer engagement (e.g. incentives for roof repair), and claims preparedness.

The future of computer vision will grow as industries allow technological advances to chart their 'known-unknowns.'   Weather Analytics can provide insights into what types of data insurers might want to start collecting – and imagery analytics will allow them to deploy that data collection in a cost-effective and targeted manner.

Read More

Why we care about GOES-R

By: Emmett Soldati | December 19, 2016 | weather-report | 0 Comments


It has been 6 years since the U.S. sent a geostationary satellite (one that moves with the rotation of the Earth) into orbit for the purpose of atmospheric observation and data collection.  On November 19th, the National Oceanic and Atmospheric Administration (NOAA) launched the next-generation Geostationary Operational Environmental Satellite (GOES-R) into orbit.  Satellite imaging technology has greatly improved since the mid-2000s.  So too have the algorithms that derive weather phenomena on the ground from the radiation measured and cloud images captured by the satellite. Weather observations from space are getting better.

For those customers and organizations whose business depends greatly on understanding the atmosphere, 'better' means two things – higher resolution and faster updates.  For the meteorological activity we can observe and extrapolate from these satellites, data from the GOES-R satellite will increase geospatial resolution up to 4 times, and refresh rates (timeliness) will be as frequent as 30 seconds.  Finer and faster data is good, but businesses, such as large insurance companies or agro corporations, need to put that data into context – to analyze it in the environments they operate in to make better, and faster, decisions.



As an atmospheric risk solutions company, Weather Analytics identifies and procures the best-in-class sources of data to address challenges posed by weather and environmental activity.  We leverage, cleanse, and enhance data from multiple public and private sources – and produce our own geophysical data with in-house meteorological experts.  As a solutions company, with a SaaS model to help customers assess and mitigate large global risks in real-time, we know the problems our clients need to solve and how best to deliver actionable insights.  With the enhancements to the meteorological data in the Western Hemisphere that GOES-R will provide, Weather Analytics is ‘at-the-ready’ to provide our clients with the value these upgrades bring.


First and foremost, Dexter, the leading weather claims forensics tool, will gain an added feature – lightning detection – slated to launch in 2017.  U.S. users of Dexter already receive the highest-resolution verification of weather perils from reliable and quality-controlled sources available – including precision hail, rain, and wind gust reporting.  As GOES-R is the first geostationary satellite to house a continuous lightning imager, the data provided will enhance our user-friendly weather tracker platform – for easy reporting, verification, and claims analysis of lightning strikes.




Wildfires are on the rise globally, and the recent events in the Southeastern United States show us that climatological factors are changing the hot zones for risk of spread here in the U.S.  Combining multiple meteorological and topographical data sources, Weather Analytics began developing its comprehensive Wildfire Awareness solution earlier in 2016 – including Wildfire Vulnerability Mapping, damage assessments, and analysis of Fire Weather changes across the globe over the last four decades.  With the enhanced fire signatures captured by the GOES-R Advanced Baseline Imager, Weather Analytics will be able to deliver more timely, higher-resolution footprints of burn during and after a major fire event.


Hurricane forecast models are slated to improve with the enhanced (and rapidly refreshed) imagery of tropical storms and cyclones.  This is a salient feature of GOES-R and will provide a major upgrade to numerical weather prediction systems, including those run by governmental organizations.  That is good news for Weather Analytics subscribers to Beacon Hurricane, our real-time machine-learning hurricane forecast dashboard.  Weather Analytics scientists have spent 12 months designing the world's most intelligent multi-model hurricane ensemble – leveraging known hurricane forecast data from the top 3 forecasting agencies: NOAA's National Centers for Environmental Prediction (NCEP), the European Centre for Medium-Range Weather Forecasts (ECMWF), and Environment Canada.  Using machine-learning techniques similar to those used by Amazon, Weather Analytics algorithmically studies and tracks the accuracy of over 90 model forecasts from these agencies and blends them to provide the most accurate real-time hurricane prediction.  As the underlying models improve by incorporating GOES-R imagery, so too will the resulting Beacon Hurricane forecast.  With added geospatial analytics such as landfall probability calculations and maximum probable loss estimates for assets, as well as automated early warning systems tied to forecasted winds and rain, Beacon Hurricane is and will continue to be the new baseline for preparing for and responding to catastrophic tropical cyclone damage.


These are a few of the ways our customers will see, and experience, improvements to their risk mitigation tactics.  Weather Analytics is growing very fast – with proprietary scientific content expanding into new domains each quarter.  By the time the GOES-R data becomes operational (middle to end of 2017), we'll likely have developed new solutions to even better leverage these scientific data feeds – ranging from pollution detection and visibility indices to higher-accuracy precipitation metrics.


Read More
Data Visualization

CO2 Emissions v. Vulnerability to Climate Change, by Nation

By: admin | October 12, 2015 | weather-report | 0 Comments

Which nations are most vulnerable to the negative impacts of climate change? According to the research, not the nations that caused it. MHA@GW, the online master of health administration from the Milken Institute School of Public Health at the George Washington University, shared with us this data visualization to illustrate the comparison between the nations most susceptible to climate change and the nations that emit the highest levels of CO2. It compares data from the Notre Dame Global Adaptation Index (ND-GAIN) with data from the Carbon Dioxide Information Analysis Center (CDIAC).



Content and visual provided by George Washington University.

Read More

Does NH Shine Bright Enough For SolarCity?

By: admin | August 29, 2015 | weather-report | 0 Comments

Mapping the future of the marketplace throughout New Hampshire.


SolarCity, one of the nation's largest residential and commercial solar companies, has recently entered the New Hampshire marketplace. "SolarCity will make it possible for many New Hampshire homeowners to install solar with no upfront cost and pay less for solar electricity than they pay for utility power, even without including local incentives" (SolarCity). SolarCity has already begun taking orders and expects to be doing installations for customers this month. The company offers solar panel leasing and financing programs that allow customers to generate solar energy. This major player in the solar market is backed by Google and by Elon Musk, founder and CEO of SpaceX, as well as other major backers.

Being neighbors with this renewable energy outfit (their NH headquarters is in Manchester, NH and we have an office in Somersworth, NH), we got to thinking about the solar marketplace in NH and its long-term viability. As a room full of meteorologists, data scientists, and computer programmers tend to do, we decided to take the question head on.  We built a Solar Power Production index (“SPP”) to better understand the amount of solar radiation in the state and map this data to identify the best – and worst – regions for potential solar power production.



Of course, SolarCity’s business is not led by solar power alone.  As a consumer-facing company, they are subject to the challenges and opportunities that come with the retail marketplace. Like many of our own customers, SolarCity is impacted by a range of factors beyond the weather. Density of likely customers, disposable income and credit to front the costs are but a few variables that might affect the success of a takeover of the NH energy market.  As a proxy for viable business opportunities, we fused our Solar Power Production index with NH state economic and demographic data to identify best-fit opportunities for marketing and distributing SolarCity’s products.

To begin, we mapped the Solar Power Production index for each city and town in New Hampshire based on 15 years of solar climate data – which include irradiance levels coming directly from the sun as well as scattered throughout the atmosphere.   Built into this index are temperature and other non-sunshine-related variables that can impact the productivity of a given panel (for example, periods of extreme heat tend to reduce the amount of energy produced by a panel).  We used a few well-known Department of Energy algorithms to determine the typical conditions per location based on 15 years of data.
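To show the flavor of such an index, here is a rough sketch of how panel output can be scaled by irradiance and derated for heat. The -0.4%/°C coefficient is a typical crystalline-silicon value, but every number below is an illustrative assumption, not our actual SPP methodology:

```python
def panel_output_kw(irradiance_w_m2, cell_temp_c, rated_kw=5.0,
                    temp_coeff_per_c=-0.004, ref_temp_c=25.0):
    """Estimate instantaneous output of a rated_kw array: scale the
    nameplate rating by irradiance (relative to the 1000 W/m^2 test
    condition) and derate for cell temperature above 25 C."""
    derate = 1.0 + temp_coeff_per_c * (cell_temp_c - ref_temp_c)
    return rated_kw * (irradiance_w_m2 / 1000.0) * derate

# The same sunny hour yields less power during a heat wave
mild = panel_output_kw(900, cell_temp_c=30)  # warm afternoon
hot = panel_output_kw(900, cell_temp_c=60)   # extreme heat
```

Summing an estimate like this over 15 years of hourly irradiance and temperature data is the basic recipe for a location-by-location production index.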

The ranking system to determine the areas more suitable for the solar marketplace was based upon three different variables: SPP, Income, and Population.  We used the 2013 State of New Hampshire public data for population and median income (data from city and town profiles from the New Hampshire Employment Security website) to match up with the individual cities and towns.
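Our exact weighting isn't detailed here, but the general pattern of folding three disparate variables into one ranking can be sketched as min-max normalizing each variable and averaging. The towns and figures below are made up for illustration:

```python
def min_max(values):
    """Scale a list to [0, 1]; constant lists map to all zeros."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def readiness_ranking(towns, spp, income, population):
    """Average the normalized scores of the three variables and sort
    towns from most to least 'SolarCity ready'."""
    cols = [min_max(spp), min_max(income), min_max(population)]
    scores = [sum(col[i] for col in cols) / 3.0 for i in range(len(towns))]
    return sorted(zip(towns, scores), key=lambda pair: -pair[1])

# Made-up figures for three towns
ranked = readiness_ranking(
    ["Alton", "Bedford", "Concord"],
    spp=[4.1, 4.3, 4.0],                 # solar index value
    income=[70_000, 105_000, 60_000],    # median income, $
    population=[5_000, 22_000, 43_000],  # residents
)
```

Normalizing first keeps a large-magnitude variable like population from drowning out the solar index.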

The figures below display our findings for each step. Our heat maps indicate which places are the overall best – or worst – targets for SolarCity – a result we call “NH SolarCity Readiness.”

(Note: The areas with no data available on the map are townships and territories in NH not recognized as cities or towns.)




Click To Download Full-Sized Maps: Solar – Population – Income – Total


  • Top 5: Solar – New Durham
  • Top 5: Population
  • Top 5: Income – Hampton Falls
  • Top 5: SolarCity Readiness – Hampton Falls

Overall, the southern part of New Hampshire is the strongest candidate for solar energy providers such as SolarCity. Interestingly, there was a strong correlation between the municipalities with the highest populations and median incomes and those with the highest levels of solar radiation. More than a spurious correlation – we know from experience that the sunny Seacoast has been a great place to settle and thrive!

While solar is still far from the mainstream in the state, NH is among the states with the highest electricity rates in the country. With the opportunity for NH residents to look skyward to help reduce the cost of electricity and make solar a much larger part of the power supply, there is a good base to go after in this marketplace. While New Hampshire may not rank as high in solar energy production as high-performing states such as California, Nevada, or Arizona, there is certainly room for growth. If we were consulted on this project for SolarCity (we're standing by the phones, SC!), we'd recommend Bedford, Windham, Hampton Falls, Stratham, and Newfields as the initial launch locations to gain the biggest early adoption.

Weather Analytics helps businesses and organizations see weather differently by highlighting the impacts of the weather on operations and providing decision support solutions.  Quite often, the value of weather data comes not through the data itself, but how it is modeled, fused, or analyzed.  That’s why we call ourselves Weather Analytics.  In the case of SolarCity, we welcome our new neighbors to New Hampshire with a way of seeing NH weather differently.

Blog post contributors: Sean Daigneault, Justin Bloom, Emmett Soldati, Kristen Jewett.

Read More

Why Weather Analytics: Dr. Ellen Cousins

By: admin | April 30, 2015 | weather-report | 0 Comments

‘Why Weather Analytics’ is a monthly series about all of the hackers, entrepreneurs, teachers, tornado-chasers and weatherheads who make up the body of Weather Analytics, how they got here, and why. Each month we’ll focus on a new employee, their story, and what about Weather Analytics pulled them in.

This month we’re featuring Dr. Ellen Cousins, Data Scientist at Weather Analytics, and how her work at Dartmouth College and NCAR has informed her work at Weather Analytics.

Dr. Cousins

Dr. Cousins is a data scientist at Weather Analytics, where she uses machine learning and large-scale computing to turn  weather data into actionable information. Through deep statistical analysis and the fusion of weather with external data sets, she has developed real-time predictive models of crop production, has identified relationships between weather conditions and sales for a prospect in the retail sector, and has also developed tools to estimate location-specific risk of high wind gusts. Additionally, she has contributed to creating and updating a large-scale web-enabled database of weather data running in the cloud.

1) Tell me about the career path that led you to this role as a Weather Analytics data scientist. What was the focus of your past research and work at NCAR and Dartmouth?

Ever since a research internship I did as an undergrad, I have been interested in the role of data analysis/statistics in solving challenging technical & scientific problems. This interest carried over into my PhD work at Dartmouth. I studied space physics and electrical engineering, but my research was based on very large data sets of space physics-related observations.  I focused particularly on a decades-long collection of observations from an international network of radars that measure properties of the Earth’s ionosphere (the electrically charged component of the upper atmosphere). I worked on tools to transform the raw data into information that other scientists could easily incorporate into their own research, and I applied new statistical/computing techniques to look for patterns in the observations that I leveraged to develop predictive models.

My postdoctoral work at NCAR was a continuation of my PhD work. I continued to add to my statistics/computing toolbox, and I applied these tools to develop data products and predictive models using several large Earth & space-based data sets. One problem I worked on involved filling in gaps in information about the ionosphere using whatever data was available, together with statistical modeling. The resulting gap-free and stable output is much easier to ingest into other studies or models than the raw observations.

Dr. Cousins

2) How did you first hear about Weather Analytics?

I was nearing the end of my postdoctoral fellowship and exploring options for my next career step. A Weather Analytics Data Scientist job posting showed up in my LinkedIn feed and caught my attention.

Science Workshop

3) What inspired your move to Weather Analytics?

When finishing up my postdoc, I knew that I wanted to try out a position in industry as opposed to academia, primarily because I wanted to see the impacts of my work more quickly and I wanted the opportunity to apply my stats/computing experience to solving a variety of real-world problems. The Weather Analytics Data Scientist position was a perfect step for me. There were enough similarities to my previous work (geospatial data and a connection to geosciences) that I could hit the ground running, but enough differences to make it new and challenging. I liked that the small size and fast pace of a start-up would allow me to get experience with many facets of a data scientist role, rather than be siloed into one specific focus area, and that the wide variety of projects to work on would always keep things interesting.

4) Tell me about your job now and what are its core components.

I’m involved in the data side of Weather Analytics at every stage of the pipeline. I work on the tools used to transform the original weather data into something that’s easily accessible and usable. I’m involved in work to create new products from the weather data. And I work on fusing weather data with non-weather data to find correlations and build predictive models. Two aspects of the work have been particularly interesting. The first is finding ways to deal with data at a much larger scale than I’d worked with before (I’ve learned a lot about databases and I’ve gotten better at writing memory- and compute-efficient code). The second is in the area of data fusion: it was rewarding to apply basic principles to build, from the ground up, a predictive model of crop production, something I had no prior experience in.

5) You haven’t worked at Weather Analytics for a full year yet, but what has been your experience so far working for this company in particular? How have you liked it? What do you like most about your job?

I have loved my job here so far. I enjoy getting to work with a group of really smart people from a variety of different backgrounds. I like the balance of applying existing skills and learning new ones, and the balance of independent work and teamwork. And I like seeing my work put to use so quickly by customers for real-world applications.


Read More

Weather Jams – Weather Data Sonification

By : admin |April 15, 2015 |weather-report |0 Comment

Weather Analytics is constantly striving to find new uses for our data, new tools to integrate into our array, and (most importantly) new ways to make our data accessible to the world at large.

It only seemed natural then that we get creative and use data to make a song. This is that story.

Data sonification isn’t new, but it’s not remarkably well known either. CERN has made symphonies from data, and Black Midi has gotten into the game as well.

For our song, we wanted to start with something simple, someplace close to home.

What follows is the story of how we turned 48 hours of our Washington DC area weather data (from Jan 1st & 2nd, 2014) into music.

Weather Analytics’ standard dataset is broken into 14 standard variables, which include things like wind speed, rainfall, direct normal sunlight, and temperature. These became our notes.
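As a rough sketch of how one of those variables might become notes, a simple approach is to scale each reading onto a MIDI pitch range. The mapping, pitch range, and temperature values below are our own illustrative assumptions, not the actual process used for the song:

```python
# Hypothetical sketch: linearly scale a weather series onto MIDI pitches.
# The values and pitch range below are illustrative assumptions only.

def to_midi_notes(values, low=48, high=84):
    """Map each reading to a MIDI note number between low and high."""
    vmin, vmax = min(values), max(values)
    span = (vmax - vmin) or 1  # guard against a flat series
    return [round(low + (v - vmin) / span * (high - low)) for v in values]

# Twelve made-up hourly temperatures (°F) standing in for the DC data
temps = [28, 27, 25, 24, 26, 30, 33, 35, 34, 31, 29, 28]
notes = to_midi_notes(temps)  # the coldest hour maps to the lowest pitch
```

Each of the 14 variables could be mapped this way and written out as its own track before the tracks are blended together.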

The biggest trouble was finding the right software to turn these notes into the song we imagined. We tried xSonify, Sonification Sandbox, and Audacity before creating a MIDI file and translating it into a score with MuseScore.

Hear the final song here:

[su_vimeo url=”″]

(Prior to using our own data, we started the project by trying to make music from the data behind satellite imagery. The cloud cover amounts behind the satellite image in the picture below were translated into notes, but the sheer amount of data was difficult to process into a song.)

Satellite Image

Read the full documentation by our data intern, Hannah, here:

I started by downloading xSonify. I initially couldn’t import documents into xSonify and couldn’t figure out why. Later on, I figured out that the difficulty was that there were specific instructions on the NASA website that needed to be followed, but by that time an update from Java had changed its security level so that xSonify wouldn’t even open. So that option didn’t work.

Next, I tried Sonification Sandbox, which consists mainly of a spreadsheet and sound editing tools.

Sonification Sandbox

I started out by replacing the two lines of numbers with two lines of data from my Excel sheet, and that made an interesting tune. When I put in more lines, however, the document froze and then wouldn’t make any music whatsoever. Later on, I realized I had been doing things wrong: the first line of numbers should stay the same because it marked time. I fixed that, but I still had trouble trying to copy more lines of data in. I tried importing the numbers instead, but that didn’t work either. At this point, I was only importing 40 instances of each weather variable at a time.

Then I tried using Audacity and importing the data that way. This time I imported everything I could, including all 8,000-some instances of each weather variable. Audacity ended up producing more noise than music. Some of it was very interesting noise, such as the wind noise I produced at one point, but most of it was just jarring to the ears. Eventually, I gave up and went back to Sonification Sandbox.

This time, instead of trying to put all the variables in at once, I did them one at a time (60 instances of each), and then exported each one into a midi file.  I downloaded a program called Aria Maestosa, into which I imported the Midi files and played them together.  I tried to pick the most musical ones and make sure they blended. I also changed the key and the tones for each of the sounds to help the music blend.  

Finally, I exported the result as a MIDI file and used another program, MuseScore, to convert the final MIDI file into a score. An example of one of the pieces of sheet music generated can be seen below.

Overall, this was a fun experiment with our weather data to use sonification to put music to a couple of cold days in January with mild winds.

Sheet Music


Read More

Introducing… Sharknado Risk Alerts

By : admin |April 01, 2015 |weather-report |0 Comment

Weather Analytics is excited to announce our latest weather intelligence product, Sharknado Risk Alerts.

This innovative new product will initially provide information on four of the top five surface weather variables that impact the danger of Sharknados: Precipitation, Humidity, Temperature, and Wind Speed. The fifth variable (Wind Direction) will be rolled out in the first product update at the end of April 2015.

These weather variables will be combined with the Sharknado Prediction Center’s hourly updated information on the likelihood of a Sharknado event to provide the end-user with a Sharknado Risk level score.

panorama image


As you can see in the sample screenshot from this new tool, Sharknado Risk Alerts are ranked on a scale from 0 to 10 for a given location (up to a 10km x 10km area), with “0” indicating no risk, all the way up to “10” indicating grave peril. Each risk score provides a warning/alert for that particular location.

For instance, in the screenshot above with a Risk Alert score of 8, the tool indicates: “We recommend that all citizens of the District of Columbia take immediate cover in their home, work or nearest fishing boat. Strong sharknados with sharks up to 5,000 pounds and winds up to 300 mph expected throughout the day.”

With the increased occurrence of Sharknado events in recent years, we feel confident that these alerts will provide people with the information they need to plan and protect themselves. These weather variables, combined with frequently updated Sharknado information and the ability to set custom alerts, are completely new to the field of severe weather event alerting.


For more information on Sharknado Risk Alerts, please contact


(April Fools!)… we fooled ya … huh?

Read More
Ice Castle lit up at night

Do You Want To Build an Ice Castle?

By : admin |March 26, 2015 |weather-report |0 Comment

Why build a snowman when you can build an ice castle? (Sorry Olaf).

Ice Castles got their start in 2006, when creator Brent Christensen began building vertical ice structures in his backyard with running water. The Ice Castles grew more grandiose with each passing year, and he soon found that he could build an infrastructure for the castles by creating icicles and affixing them together. Thousands upon thousands of icicles are grown and harvested every winter to be added to the foundation of the castle and build up its different structures and forms. By 2009, the Ice Castles hit the public stage when one was built at a resort in Utah. In the years since, the Ice Castles have become quite the phenomenon, with thousands of people visiting them in locations around the country and plenty of media attention.

According to Ice Castles, The Story Behind the Magic, “Brent’s Ice Castles look like something that materialized out of nature. They are built with icicles – many, many icicles and become something quite magical. The icicles are sprayed with water to create great glacial towers and caves filled with ice formations which hang with stalactites of ice.” These ice structures feel like something that materialized out of a fairy tale, shining bright in the sunshine and lit up at night by a magical display of lights. If you’ve ever seen or even heard of the movie Frozen, the Ice Castles certainly conjure up images of a real-life fairytale.

Ice Castle lit up at night


I inquired about the optimal weather for maintaining these structures, and the master builder for the NH Ice Castles location responded that anything below 0˚F is ok, but between 0˚F and 20˚F is the optimal temperature range. Thinking about this, and given how cold and snowy this season has been, I felt compelled to see how this winter stacked up against previous winters at this location in New Hampshire, and whether the temperatures worked for or against the creation of these ice castles.

At first glance, the Average Surface Temperature for the 2014-2015 winter season (defined here as October through March) indicates that this winter was much colder between January and March than the average of the past 15 years. This is especially true for February 2015, which was 16˚F below the average of the past 15 Februaries. That was actually a great thing for the maintenance of the ice castles, as 20˚F is the high end of the builders’ optimal temperature range, and the cold lasted through the only full month the Ice Castles were open to the public (January and March were both partial months).

Surface Temperature

Average Surface Temperature, October 2014 through March 2015

The max temperature in Lincoln, NH this February was much lower than the average of the previous 15 years.

Max Surface Temperature

Max Surface Temperature: Months – October 2014 – March 2015

The min temperature in Lincoln, NH this February was also much lower than the average of the previous 15 years.

Min Surface Temperature

Minimum Surface Temperature: Months – October 2014 – March 2015

This illustrates (with the minimum and maximum values included) how the 2014/2015 season departed from the 15-year average for Lincoln, NH.

Departure from Average

Departure from 15 Year Average (against winter 2014/2015 season)
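The departure-from-average comparison behind these charts can be sketched in a few lines. The temperature values below are invented stand-ins for illustration, not the actual Lincoln, NH station record:

```python
# Illustrative sketch of a monthly departure-from-average calculation.
# All temperatures are made-up stand-ins, not the real station record.

def departure_from_average(current_mean, past_means):
    """Current month's mean temperature minus the multi-year baseline."""
    baseline = sum(past_means) / len(past_means)
    return current_mean - baseline

# Hypothetical mean February surface temperatures (°F) for 15 prior winters
past_februaries = [22, 24, 21, 23, 25, 22, 24, 23, 21, 26, 22, 24, 23, 25, 22]
feb_2015_mean = 7.2  # hypothetical mean for February 2015

departure = departure_from_average(feb_2015_mean, past_februaries)
# A negative departure means the month was colder than the 15-year average
```

Repeating this for each month of the season gives the departure curve plotted above.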

Beyond the surface temperature, the other temperature variable considered here is the “Apparent Temperature,” which includes wind chill. The plummeting apparent temperatures in 2015 show a big dip in perceived temperature from January into February. Without the sales figures for the Ice Castles over this time frame, it is difficult to say how the storms and cold weather affected the number of visits, but overall it can be concluded that this year was optimal compared to other years for the actual creation of the ice castles themselves.

Departure from Average - Apparent Temperature

Departure from 15 Year Average (compared to 2014-2015 winter season) for Apparent Temperature

Now, with spring here (at least in spirit), the Ice Castles have closed for the season, and we are looking for an uptick in temperatures in Lincoln, NH. If this year’s temperatures were any indication of years to come, then this location should continue to be a great home for New Hampshire’s Ice Castles.

Frozen Water Fountain

Frozen ice fountain bubbling over with water


According to Ice Castles, The Story Behind the Magic, over an average three-month season, an Ice Castle uses around 450,000 icicles!

For more information on the Ice Castles:

Information on the story of the Ice Castles was provided by the book Ice Castles, The Story Behind the Magic.


Read More