Making the Grade with SSL

Disclaimer: These are the steps that I followed. Please do due diligence and investigate fully before you attempt to modify your own server. Your mileage may vary…

I have a number of websites that run on Windows servers running Internet Information Services (IIS). One requirement I pretty much insist on is that if a site allows you to log in, it has to have an encrypted means of communication. We do that by generating a certificate request inside of IIS Manager and sending the certificate signing request (CSR) off to be signed by a certificate provider like GlobalSign, DigiCert, etc. The certificate provider will sign the certificate and send you back a blob of text that you can save in a text file with a *.cer extension. You then open up IIS Manager, select the server and complete the certificate request, which installs the certificate on the server. Finally, you edit the bindings for the website that you want to enable SSL on, add an HTTPS binding and select the certificate.
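For servers where clicking through IIS Manager isn’t convenient, the same CSR can be generated from the command line with Windows’ built-in certreq.exe. A minimal sketch of a request .inf file follows – the subject name and key length here are placeholders for illustration, not values from this post, so adjust them for your own server:

```ini
; request.inf -- minimal CSR template for certreq (values are placeholders)
[NewRequest]
Subject = "CN=www.example.com"      ; your site's fully qualified domain name
KeyLength = 2048
KeySpec = 1
MachineKeySet = TRUE
ProviderName = "Microsoft RSA SChannel Cryptographic Provider"
RequestType = PKCS10

; Generate the CSR:
;   certreq -new request.inf request.csr
; After the provider signs it, install the certificate:
;   certreq -accept signedcert.cer
```

The GUI route in the paragraph above and the certreq route end up in the same place: a certificate installed in the machine store, ready to be bound to an HTTPS site.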

Easy peasy…you’re done, right? Unfortunately, not quite.

There’s all kinds of security buzz these days about SSL work-arounds and tricks that reduce the security it provides – funky names like the BEAST attack, POODLE, FREAK, etc. So we want to make sure that the ciphers and encryption techniques we use are as safe as possible. There are tools available on the web that will hammer your SSL implementation and tell you if there are any weaknesses. One such online tool is the Qualys SSL Labs test – available at:

I ran the SSL Labs scan on a Windows Server 2008 R2 box running IIS 7.5 that I’d just installed a certificate on. The results were not very good with the out-of-the-box settings – an “F” (see below).

SSL Labs Initial Scan

The report gives some feedback on what SSL Labs thinks the deficiencies are in the site’s SSL configuration, plus links to some more info. In the case of this Windows 2008 R2 server, it identified:

  • SSL 2 is supported – it’s old, it’s creaky, and it’s not to be trusted
  • SSL 3 is supported – (see above) and it’s vulnerable to POODLE attack (oh noes – not poodles!)
  • TLS 1.2 isn’t supported – TLS 1.1 isn’t either, but they leave that out, we’ll fix that too
  • Some of the SSL cipher suites advertised by the server as supported are either considered weak, or they don’t support Perfect Forward Secrecy

The first three items we can fix by editing the registry, the last item requires us to modify one of the group policy settings. The standard disclaimers apply – don’t make any changes to your system unless you are a highly trained professional who understands that these changes may cause your system to no-worky and make sure you have a full backup of the system so that you can restore it if things go sideways.

To disable SSL 2.0 & 3.0 and to enable TLS 1.1 & 1.2, I had to run Regedit.exe and go to:


You’ll probably only see one key under Protocols – SSL 2.0 – and in my case it only had a Client key.

SSL 2.0 Initial Values

I created a Server key under SSL 2.0 and added a DWORD value named “DisabledByDefault” with a data value of “1”. Now the server won’t attempt SSL 2.0 connections.

Disable Server SSL 2.0 and 3.0

To disable SSL 3.0, create a similar SSL 3.0 key under Protocols, create a key called Server under it, and add a DWORD named DisabledByDefault with a data value of “1” there as well. No more SSL 3.0 served up now.

To enable TLS 1.1 and 1.2, follow similar steps: create TLS 1.1 and TLS 1.2 keys under Protocols and create a Server key under each. This time, however, I added two DWORD values under each Server key: one named DisabledByDefault with a data value of “0” (we don’t want them to be disabled) and a second named “Enabled” with a data value of “1” (the default is “0”, so you’ll need to change the value to “1” once you create the entry).
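If you’d rather not click through Regedit by hand, the protocol changes described above can be captured in a .reg file and imported. This is a sketch based on my steps, using the standard SCHANNEL location that Microsoft documents for protocol keys – please double-check it against your own server before importing anything:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server]
"DisabledByDefault"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server]
"DisabledByDefault"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001
```

Importing a file like this does exactly what the manual steps do – creates the Server keys and sets the DWORD values – so the same “back up first” disclaimer applies.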

Keys to enable TLS 1.1 and 1.2

I closed Regedit – there’s no “save” step, as registry edits are written as soon as you make them.

Next we need to edit the group policy setting that determines which SSL cipher suites the server will offer up. To edit the group policy on my stand-alone server, I clicked Start -> Run and typed “gpedit.msc” to open the Windows group policy editor snap-in. The entry we want to modify is under:

Computer Configuration -> Administrative Templates -> Network -> SSL Configuration Settings

The specific setting is “SSL Cipher Suite Order”, which is “Not Configured” by default. This means the server falls back to the Windows Server default ciphers and ordering.

SSL Cipher Suite Order Default State

To only serve up ciphers that aren’t weak and that support Perfect Forward Secrecy, I had to choose a subset of ciphers. Luckily, Steve Gibson at GRC shared a list of ciphers that meet those criteria on his site at:

One caveat is that the list you paste into the group policy editor has to be a single line of comma-separated values – no carriage returns or the like. I copied the text from Steve’s site into Notepad and then hit Home + Backspace at the start of each line, working from the bottom up, until I had a single line of comma-separated values.
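If the Notepad gymnastics aren’t your thing, the flattening step is trivial to script. Here’s a minimal Python sketch – the two suite names are just examples, not the full GRC list:

```python
# Collapse a multi-line cipher suite list into the single
# comma-separated line that the group policy editor expects.
multi_line = """\
TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
"""

# Strip whitespace, drop blank lines, join with commas.
single_line = ",".join(
    line.strip() for line in multi_line.splitlines() if line.strip()
)
print(single_line)
# TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
```

Paste your real list between the triple quotes and the printed result is ready to drop into the policy textbox.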

Cipher suites in a single line

Click the “Enabled” radio button, highlight the default values in the SSL Cipher Suites textbox and delete them, paste in the new values from Notepad (remember: single line, no line breaks), then click Apply and OK and we’re done.

SSL Cipher Suite Order Enabled

I then closed the group policy editor MMC snap-in, rebooted the server (the changes won’t take effect until you do) and went back and re-ran the Qualys SSL Labs test by clicking the “Clear Cache” link. SSL Labs caches the results of the previous scan, so unless you click that link, you’ll just be looking at the old results.

Qualys SSL Labs A Grade

Voila! We’ve gone from an “F” grade to an “A” grade. Whether the site is actually more secure or not is beyond the scope of this blog post, but if I am being asked to serve up an SSL secure site and it gets an “F” there would be some ‘splainin’ to do.

Hopefully this helps with understanding what steps were required for me to get the “A” grade.

Where Have All The Research Vessels Gone?

That’s the question that we hope to answer with the International Research Ships Schedules & Information project. Imagine with me if you will a site where we can log metadata about the research adventures of all oceanographic research vessels. Think of the opportunities that it would open up for researchers wanting to know who has been exploring in a specific region of the ocean, what they were looking for and, if we can get the metadata exposed, what data they collected with possible links to where it can be discovered. I’m proposing that we continue to develop the pot where all of that ship information and cruise metadata can be cooked, blended together with just the right seasonings (algorithms), and develop the scoops (tools) that would help users, agencies and researchers pull out a portion that suits their needs. I like to call the concept Stone Soup Science. I love the Stone Soup story and think the concept that it conveys in this data context is a perfect fit. (It’s much better than my other depictions – “Show Me The Data!” or <cue AC/DC music>”Dirty Data…Done Dirt Cheap”  ;?)

Stone Soup Science

No, I am not proposing that we build a data warehousing site to hold all of the oceanographic data that’s being collected. There are agencies and organizations all over the world that are already doing a great job of that. What I am proposing is that we continue the modernization of the Research Vessels site to help users mine for and discover where RVs have operated in the past. The next step would be to expose links to those data warehouses. I certainly wouldn’t want to have to comb through the holdings of NOAA, R2R, IFREMER, CSIRO, etc. to find out who has been doing what oceanographic research, where they went and when they were there. I think this is a better one-stop RV shop. All research vessels would be added to the site – not just vessels over a certain size, not just vessels that belong to a certain agency, not just vessels that specialize in one facet of science. We can most certainly create customized views of those, but they’d be done by querying the vessel database to return just the vessels that are of interest to the user or association.

A student’s Ocean Bytes article shows one of the benefits of being able to leverage and repurpose underway ship data. Eric Geiger pulled together the underway surface mapping data from four regional research vessels to create his satellite salinity modelling algorithm as part of his thesis research. It took Eric a significant amount of time to figure out which research vessels had been working in the region he wanted to investigate, and even more time to get access to the data that the ships had collected. Imagine if we were able to put together a set of online tools that facilitated that type of investigation. That’s part of what we hope to accomplish with the International Research Vessels Schedules & Information site.

Rather than regurgitate the information on the history and future modernization thoughts on the International RV project, I’ll refer you to the About Ships page, which does a pretty good job of explaining things.

International RV Tracks 2002-2011

The visualization above is a plot I made of a subset of research vessel cruise tracks from 2002-2011. In blue are the vessels designated as US RVs and in red are the non-US vessels. I think it’s quite intriguing to be able to visualize the locations where we are conducting research, the transits to the locations where we plan to research (sensors still collecting underway data) and, sometimes more telling, the gaps in coverage where nobody seems to be going. Many thanks to for the data dump.

RV Hugh R Sharp Ship Track

A couple of years back, we helped the RV Hugh R Sharp set up a ship-to-shore data transfer mechanism using their newly acquired fleet broadband. Every hour or so, the transfer scripts zip up the ship’s underway data files and transfer the data ashore. It dawned on us that we could peek inside the data archive sent ashore, parse out a subset of the ship’s underway data in 10 minute intervals, and display the ship track and its underway data in near real time. The RV Sharp Ship Tracker site was born out of this effort (see screenshot above). We’d like to prototype a more user-customizable open source version of this code and allow others to use it for their own ships. This data feed could then be pulled into the Research Vessels site to show a higher resolution ship track for the RVs that participate, as well as exposing the ship’s underway data for possible re-use by other students and researchers. For those institutions that already have a ship tracking application in place, we could develop services that would allow for harvesting and repurposing those data as well. The thought would then be to expose all of the ship information, schedule and track metadata via a web API that would allow others to use and repurpose the data – whether on their own sites, displaying just a subset of the ships that they are interested in, or in mobile apps. Open data!
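The “10 minute intervals” step described above is a simple decimation of the timestamped underway records. A rough Python sketch of the idea – the record format here is invented for illustration, not the Sharp’s actual file layout:

```python
from datetime import datetime, timedelta

def decimate(records, minutes=10):
    """Keep only the first record seen in each time window,
    thinning a dense underway feed down for display."""
    kept, last = [], None
    for stamp, fields in records:
        if last is None or stamp - last >= timedelta(minutes=minutes):
            kept.append((stamp, fields))
            last = stamp
    return kept

# Example: three readings, two of which fall inside the same 10-minute window.
raw = [
    (datetime(2012, 6, 1, 12, 0), {"sst": 18.2}),
    (datetime(2012, 6, 1, 12, 4), {"sst": 18.3}),   # dropped: < 10 min after 12:00
    (datetime(2012, 6, 1, 12, 11), {"sst": 18.5}),
]
print(len(decimate(raw)))  # 2
```

The real scripts would read the records out of the hourly zip archive, but the thinning logic is the same.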

No Funding – but that won’t stop us!

I’ve been involved in the development and operation of the International Research Vessels Schedules & Information project since around 1998 or so. The project was previously funded by a collection of sponsors including NSF, NOAA, ONR, NAVO and the USCG, with each of them contributing funds towards the operation of the project until around 2005. Budget cuts at the time, plus the post-9/11 reluctance of some agencies to share upcoming ship schedules, resulted in our program losing its funding. I put the project into hot standby until funding could be obtained to resume project development and metadata collection. The site stayed online and new ship information was added as it was received, but no major reworks of the site’s underpinnings happened. I’ve done the dog-and-pony show showcasing the site and its potential to a few groups since then, attempting to get funding to move forward, and while nearly everybody seemed to agree that the project should be funded, no funding ever came.

Well, I’m tired of waiting. Web technologies are advancing at breakneck speeds and it’s time to move this project forward. Funding or no funding. (It’s either that or start working on a Flappy Ships app, make millions, but not contribute towards science ;?)

I’ll be working on the project on my own time (after work hours and on weekends) as an open source / open data effort. This will be my citizen science contribution in support of ocean research. I’ll lay out a plan of attack for upgrading the backend database, moving it into a SQL Server database for easier scalability as well as access to the geography data types it provides. A couple of most excellent developers have offered to give me assistance and guidance in the quest to modernize the project site and its underpinnings. Currently that’s the ever awesome Chris Woodruff, Rachel Appel and <insert your name here> ;?)

I’m always open to more help and ideas to maximize the project’s capabilities and potential, so if you’d like to lend a hand, do some research, contribute some code, or offer up some other resources (cash, software, training), please let me know by emailing me at The project needs re-architecting, the data tables need normalizing/denormalizing, the web design needs to be majorly spruced up, GIS/mapping strategies and tools need to be figured out, ship data needs to be refreshed, web APIs need to be written, etc. Lots to do!

Help me help science!

Doug White

POV Sport25 Video Glasses for Fun & Research

How many times have you missed an opportunity to get some neat video to share because you didn’t have an extra hand to hold a video camera? Yeah, lots of times, me too! Well, no more!

The other day I ran across a special on for a set of POV Sport25 video glasses and I figured what the heck and pulled the trigger. They arrived this week and I took them for a test drive (literally).

POV Sport25 video glasses

They are a lot lighter than I expected and my mind is racing with all the video opportunities these bad boys will open up for us here at the college. The camera is right in between the lenses (see the tiny pinhole above). Imagine being able to film the first-person view of a grad student tagging a shark, or the deployment and recovery of underwater robots, or even <insert your scenario here>! They came with a glasses case, cleaner, USB cable and – for those doing work indoors – a set of pop-in clear lenses so that you’re not walking around in too dark of an environment.

To give you an idea of the video quality, I charged them up and wore them out to the parking lot and on my drive downtown. The uploaded video follows. Enjoy!

Atlantic sturgeon arriving earlier in the mid-Atlantic

The unusually warm conditions in the winter and spring of 2012 have resulted in water temperatures up to 3°C warmer than in the previous 3 years, with comparable Atlantic sturgeon catches off the coast of Delaware occurring 3 weeks earlier than in past sampling efforts. During sampling events for Atlantic sturgeon we have also documented sand tiger sharks arriving off the coast of Delaware in late March, a full month earlier than documented in previous seasons.

My research, conducted jointly with Dewayne Fox at Delaware State University and Matt Oliver at the University of Delaware, is focused on coastal movements and habitat use of adult Atlantic sturgeon during the marine phase of their life history. By utilizing acoustic biotelemetry on both traditional fixed array platforms as well as developing mobile array platforms, coupled with Mid-Atlantic Regional Association for Coastal Ocean Observing Systems (MARACOOS) assets, I am going to model Atlantic sturgeon distributions in a dynamic coastal marine environment. This research is particularly relevant given the recent protection of Atlantic sturgeon under the Endangered Species Act. Determining the factors influencing Atlantic sturgeon movements and distributions during their marine migrations will enable dynamic management strategies to reduce mortalities as well as impacts to commercial fisheries, dredging efforts, and vessel traffic. In addition to allowing for dynamic management strategies, modeling adult Atlantic sturgeon movements and distributions in relation to dynamic environmental conditions will illustrate how changing conditions are going to impact this endangered species moving forward.

Graduate student Matt Breece with a recently telemetered female Atlantic sturgeon off the coast of Delaware

Utilizing New Tagging Technology to Characterize Sand Tiger Shark Habitat

Sand tiger sharks are large bottom-dwelling sharks found in the coastal waters of the Western North Atlantic, and are known to frequent the Delaware Bay in the summer months. Sand tiger shark populations are currently in danger of overexploitation because they are slow growing and have extremely low birth rates. While we know that the sharks are found within the Delaware Bay during summer months, little is known about their movements during the rest of the year, or what oceanographic conditions limit their spatial extent. There is evidence that these sharks make large coastal migratory movements along the Eastern Seaboard. This makes habitat characterization difficult because the sharks travel throughout such a large area. It is important for managers to know the areas of intensive use by the sharks, and the species assemblages within those areas, in order to protect these apex predators.

Dr. Matthew Oliver and Danielle Haulsee with a sand tiger shark caught in the Delaware Bay.

Our project, a collaboration among Delaware State University’s Dewayne Fox and the University of Delaware’s Matthew Oliver and Danielle Haulsee, will document and characterize the movements of sand tiger sharks, their habitat preferences, and the community assemblages they encounter using new and innovative electronic tagging technology. Sand tiger shark movements will be recorded using passive telemetry in addition to pop-off satellite archival tags. We will also deploy a new type of tagging technology, which acts as a mobile receiver, and will record any encounters with other sharks, fish or other marine animals that have been tagged with acoustic tags. We will then use satellite and remotely sensed data resources from the Mid-Atlantic Regional Association for Coastal Ocean Observing Systems (MARACOOS) to characterize and model the habitats and oceanographic conditions used by sand tiger sharks. This study will give managers a better understanding of the spatiotemporal patterns in sand tiger shark movements along the East Coast, as well as inform management decisions regarding sand tiger shark habitat utilization.

GIS Consultant Takes a Trip to Lewes

I took a fun and enlightening trip to the UD Lewes campus on Thursday. As UD’s Lead Geospatial Information Consultant, I consult on Geographic Information Systems (GIS) for research and general use. I am gearing up my knowledge of wind energy and the atmosphere by taking the “Wind Power Meteorology” course taught by Dr. Cristina Archer of UD’s College of Earth, Ocean, and Environment. I have an interest in this topic and I’m getting more consulting requests involving data from this field.

As part of that class, we took a field trip down to UD’s Lewes campus.  It was a sunny, windy day … perfect for checking out the turbine!  I made the following video with video and photos I collected with my phone, to share some of that experience.

Predicting Sea Surface Salinity from Space

The simplest definition of salinity is how salty the ocean is. Easy enough, right? Why is this basic property of the ocean so important to oceanographers? Well, along with the temperature of the water, the salinity determines how dense it is. The density of the water factors into how it circulates and mixes…or doesn’t mix. Mixing distributes nutrients allowing phytoplankton (and the rest of the food web) to thrive. Globally, salinity affects ocean circulation and can help us understand the planet’s water cycle. Global ocean circulation distributes heat around the planet which affects the climate. Climate change is important to oceanographers; therefore, salinity is important to oceanographers.

Spring Salinity Climatology for the Chesapeake

Spring Salinity Climatology for the Chesapeake

Salinity doesn’t vary that much in the open ocean, but it has a wide range in the coastal ocean. The coast is where fresh water from rivers and salt water in the ocean mix. Measurements of salinity along the coast help us understand the complex mixing between fresh and salty water and how this affects the local biology, physics, and chemistry of the seawater. However, the scope of our measurements is very small. Salinity data is collected by instruments on ships, moorings, and more recently underwater vehicles such as gliders. While these measurements are trusted to be very accurate, their spatial and temporal resolution leaves much to be desired when compared to, say, daily sea surface temperature estimated from a satellite in space.

So, why can’t we just measure salinity from a satellite? Well, it’s not as simple, but it is possible. NASA’s Aquarius mission, which launched this past August, is taking advantage of a set of three advanced radiometers that are sensitive to salinity (1.413 GHz; L-band) and a scatterometer that corrects for the ocean’s surface roughness. With this, they plan on measuring global salinity with a relative accuracy of 0.2 psu and a resolution of 150 km. This will provide a tremendous amount of insight into global ocean circulation, the water cycle, and climate change. This is great news for understanding global salinity changes. But what about coastal salinity? What if I wanted to know the salinity in the Chesapeake Bay? That’s much smaller than 150 km.

That’s where my project comes in. It involves NASA’s MODIS-Aqua satellite (conveniently already in orbit), ocean color, and a basic understanding of the hydrography of the coastal Mid-Atlantic Ocean. Here’s how it works: we already know a few things about the color of the ocean, that is, the sunlight reflecting back from the ocean as measured by the MODIS-Aqua satellite. We know enough that we can estimate the concentration of the photosynthetic pigment chlorophyll-a. So not only can we see temperature from space, we can estimate chlorophyll-a concentrations too! However, there are other things in the water besides phytoplankton that absorb light and alter the colors we measure from a satellite.

Spring Salinity Climatology for the Mid-Atlantic

We group these other things into a category called colored dissolved organic material or CDOM. CDOM is non-living detritus in the water that either washes off from land or is generated biologically. It absorbs light in the ultraviolet and blue wavelengths, so it’s detectable from satellites. In coastal areas especially, its main source of production is runoff from land. So, CDOM originates from land and we can see a signal of it from satellites that measure color. What’s that have to do with salinity?

You may have already guessed it, but water from land is fresh. So, water in the coastal ocean that is high in CDOM should be fresher than surrounding low-CDOM water. Now we have a basic understanding of the hydrography of the coastal Mid-Atlantic Ocean, how it relates to ocean color, and why we need the MODIS-Aqua satellite to measure it.

So, I compiled a lot of salinity data from ships (over 2 million data points) in the Mid-Atlantic coastal region (Chesapeake, Delaware, and Hudson estuaries) and matched it with data from the MODIS-Aqua satellite in space and time. Now I have a dataset that contains both ocean color and salinity. Using a non-linear fitting technique, I produced an algorithm that can predict what the salinity of the water should be given a certain spectral reflectance. I made a few of these algorithms for the Mid-Atlantic, one specifically for the Chesapeake Bay. It has an error of ±1.72 psu and a resolution of 1 km. That isn’t too bad considering the salinity range in the Chesapeake is 0-35 psu, but of course there’s always room for improvement.

Even so, this is an important first step for coastal remote sensing of salinity. An algorithm like this can be used to estimate salinity on the same time and space scales as sea surface temperature. That’s pretty useful. The folks over at the NOAA CoastWatch East Coast node thought so too. They took my model for the Chesapeake Bay and are now producing experimental near-real-time salinity images for the area. The images can be found here: They will test the algorithm to see if it is something they want to use.
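The matchup step described above — pairing each ship observation with the satellite record nearest in space and time — can be sketched in a few lines of Python. Everything here (the tolerances, the record shapes, the sample values) is invented for illustration; the real matchup ran against the full MODIS-Aqua archive:

```python
from datetime import datetime, timedelta

def match(ship_obs, sat_obs, max_hours=24, max_deg=0.01):
    """Pair each ship salinity record with the closest-in-time
    satellite reflectance record at (nearly) the same location."""
    pairs = []
    for s_time, s_lat, s_lon, salinity in ship_obs:
        best = None
        for t_time, t_lat, t_lon, reflectance in sat_obs:
            # Spatial gate: same ~1 km pixel, approximated in degrees.
            if abs(t_lat - s_lat) > max_deg or abs(t_lon - s_lon) > max_deg:
                continue
            # Temporal gate: keep the smallest time offset within the window.
            dt = abs(t_time - s_time)
            if dt <= timedelta(hours=max_hours) and (best is None or dt < best[0]):
                best = (dt, reflectance)
        if best is not None:
            pairs.append((salinity, best[1]))
    return pairs

ship = [(datetime(2010, 5, 1, 10), 38.5, -75.0, 28.4)]
sat = [
    (datetime(2010, 5, 1, 18), 38.5, -75.0, 0.012),  # same pixel, same day
    (datetime(2010, 5, 3, 18), 38.5, -75.0, 0.015),  # outside the time window
]
print(match(ship, sat))  # [(28.4, 0.012)]
```

The resulting (salinity, reflectance) pairs are what feed the non-linear fit; the fitting itself would use a library routine rather than anything hand-rolled.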

Climatologies of salinity for all of my models can be downloaded here:

I view this project as overall support of the NASA Aquarius mission, providing high resolution coastal salinity estimates that are rooted in in situ observations. I hope this information proves useful for coastal ocean modeling and for understanding the complex processes that affect the important resource that is our coasts.

Demobilization and Remobilization of the Hugh R Sharp

Summer is an especially busy time for research vessels. The UNOLS fleet is making increasing use of containerized portable lab vans to shave some time and effort off of offloading the science party from one cruise and loading up the next mission and their gear. The vans also increase the flexibility of the research vessels by offering additional science capabilities and facilities to vessel users. Options include adding:

  • Dry Labs
  • Wet Labs
  • Isotope Labs
  • Clean Labs
  • Cold Labs
  • Additional Berthing

This is a time lapse that we shot of the RV Hugh R Sharp returning from a multi-week scallop survey, unloading one lab van and then loading two more fresh ones before fueling up (both diesel and food) and departing on the next mission. Enjoy!

It’s all about the E-Lec-Tricity

We had a gentleman named Matthew Vest come to the GVis lab the other day to show off his do-it-yourself creation. He was looking for information on whether he might be able to showcase it at the upcoming Coast Day 2011 event that happens each year on the second Sunday of October here at the Hugh R Sharp campus in Lewes.

Matthew has done something that most of us dream about doing, something many of us say we’re going to do, and that same something that most of us never get off our duffs and actually do. He has taken a 1985 Chevy S-10 truck, removed the gasoline engine and tank, and replaced them with an electric drive motor (from a forklift, he says) and a bed full of 6 volt lead acid batteries (aka ‘golf cart batteries’ – 24 in total). The conversion took him about 2 years to complete and cost approximately $10,000, but now he is the proud owner (and creator) of an all-electric vehicle that will go approximately 40-60 miles on a charge.

Matthew Vest’s Electric Truck

Matthew went out of his way to select “Deka” batteries to power his creation, which he says are 100% recyclable. Each of the 24 batteries weighs in at 60 lbs, for a total battery weight of roughly 1,440 lbs. These batteries are wired in series to generate the 144 volts DC that powers the Warp-9 electric motor that replaces the gas engine. There is one additional 12 volt battery, which powers the stock lights, wipers and horn.
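The pack numbers are easy to sanity-check: in a series string the voltages add, and the weights add no matter how you wire them.

```python
# 24 six-volt golf cart batteries in series.
num_batteries = 24
pack_voltage = num_batteries * 6    # volts DC for the Warp-9 motor
pack_weight = num_batteries * 60    # pounds of lead acid in the bed
print(pack_voltage, pack_weight)    # 144 1440
```
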

Chevy S10 – Batteries in the Back

A Curtis 1231C controller is the brains of the truck, controlling the power flow. A Zivan ‘smart charger’, which runs on standard 110 V, sits behind the driver’s seat. When fully discharged, the batteries take about 10-12 hours to recharge. The only sound the truck makes when it is running is the add-on vacuum pump that is also under the hood; it creates the vacuum that assists the truck’s stock braking system.

Chevy S10 – Under the Hood

Matthew is hoping to touch base with some of the researchers at UD that are involved in the V2G or “Vehicle To Grid” project so that he can assess whether his S10 can also be integrated with the power grid. For more information on V2G and GIEV’s (Grid Integrated Electric Vehicles) you can read more on the Q&A section on the V2G site.

We asked Matt how he got started with the project and he said it just took some research online, a couple of “how to retrofit your gas vehicle into an electric vehicle” books, and some very helpful people on a few of the EV forums. We salute Matt for what turned out to be an excellent EV refit and for his consideration of the environment when he selected the batteries and materials for his electric vehicle project. Well done!


Hurricane Katia Footprints

The ORB Lab was having a meeting in the GVis Lab this week and, as usual, the East Coast US 8-Day Averaged Sea Surface Temperature overlay was up on the screens. Dr. Oliver pointed to the screen and noted that there was a path cutting across the Gulf Stream that was cooler than usual and that it was probably due to upwelling and mixing from hurricane Katia. Sure enough, we loaded up a layer showing Katia’s track and they lined up.

Katia SST Trail

We then checked to see if there was anything noticeable on the East Coast US 8-Day Average Chlorophyll layer and you can see what appears to be a slight bloom in chlorophyll along the track as well (slightly lighter blue).

Katia Chlorophyll Trail

Another neat view is the markedly cooler water that you can see flowing into the bays from the increased river discharge that resulted from the large amounts of rain dropped by hurricane Katia and tropical storm Lee as they passed through.

Cold river water 20110913

These layers and several others are processed and uploaded daily and made available via the ORB Lab website in the Public Access section. They are exposed via Google Maps interfaces as well as Google Earth embedded views and linkable KMZ files. Neat stuff!


© 2015 Ocean Bytes Blog
