Ships, buoys, Search-and-Rescue (SAR) aircraft, and shore-side base stations send out automated messages to let each other know where they are. Kn...
You say users could broadcast text messages to the vessels.
Could you describe the systems receiving these messages? Were those full computers that showed messages on a monitor, or more specialized systems?
The hardware and shipborne side of things wasn't my specialty, but I'll answer to the best of my knowledge.
Ships have an electronic chart (map) system that shows the ship's own position and the other ships around them. I think the messages would show up overlaid on this, or possibly on a different screen. I know dedicated systems for this exist, usually called ECS or ECDIS, but it could theoretically be anything capable of receiving the data format (including a PC or tablet with the right software).
There was at one point a specialized kind of network for shipborne use, but my sense is that's moved toward more standard IT stuff (though hopefully with more waterproofing 😅): Ethernet connections, with the data itself traveling over TCP or UDP.
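If it helps make that concrete: AIS traffic on IP networks is usually carried as NMEA 0183 "AIVDM" sentences. Here's a minimal sketch of listening for them over UDP; port 10110 is just a common convention for NMEA-over-IP, not anything specific to the systems I worked on.

```python
import socket

# Minimal sketch: listen for AIS messages arriving as NMEA 0183 sentences
# over UDP. Port 10110 is a common convention, not from any real deployment.
PORT = 10110

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", PORT))

while True:
    data, addr = sock.recvfrom(4096)
    for line in data.decode("ascii", errors="replace").splitlines():
        # Position reports and text messages alike look something like:
        #   !AIVDM,1,1,,A,15MvlfP000G?nwbEdT...,0*5C
        if line.startswith(("!AIVDM", "!AIVDO")):
            print(f"from {addr[0]}: {line}")
```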
Edit to add: there weren't many people who had access to our transmit capability. Text messages were supposed to be for safety-related information only, though I think that's not enforced for the ships themselves (the ECS lets you send text messages, I think, but that didn't involve our software).
Pretty interesting, thanks for the reply :).
How are edge computing, new CDN services and concepts, and new web features (like offline mode) affecting this space?
I imagine the industry always found ways to solve offline sync issues, but it seems like some of the standards or general-purpose services are catching up with the needs?
Is that the case at all?
At least for the part of the industry I was in, not a lot. My team's web apps didn't use a CDN, since they were mostly only available on the Coast Guard network. I know the more public-facing ones used one, but beyond that I don't know how it was set up.
As far as the client side of the web apps goes, I don't think we had much need for edge computing or offline mode. The end users for those were on dry land.
As of a few years ago, cloud servers were beginning to be a thing where I worked, but they were hosted in an in-house datacenter. I think AWS was starting to be viable and certified for government work, but we weren't using it yet.
Elsewhere in the industry, there's the privately-run MarineTraffic, which might take advantage of newer front-end stuff.
My team didn't run it, but we had a viewer like that which used Microsoft Silverlight. I left almost 2 years ago, so everything I'm saying here could have changed (but government work moves slowly, so probably a lot hasn't).
The offline sync issues we had to deal with weren't with web interfaces, but we did have to handle message drops and lost connections along the route the data took from the receivers, to our main server, to the database.

The original system was designed with freshness of the data in mind: if someone was getting a feed from us, it wouldn't matter if a few messages were lost, as long as what they were seeing was current (a lot of messages were duplicates, and anyway a moving ship would send a new one every few seconds). But since our mission was to store everything, we had processes at each receiver, and at our main and DR (disaster recovery) servers, that would stream everything to a flat file. Then we had processes that would check for missing data, grab the files for each hour (always from the main and DR servers, and from the receivers if there had been any connection loss), merge them, and load them into the database, replacing what had been saved as it streamed in.

The fetch-and-merge work was mostly Perl scripts written by our sysadmin (who was also a skilled programmer), even though there was supposed to have been some Java code from previous contractors to do it.
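To give a sense of the shape of that merge step, here's a rough sketch, in Python rather than Perl, with invented file layouts and names:

```python
from pathlib import Path

def merge_hour(sources: list[str], out_path: str) -> None:
    """Merge one hour's flat files from several servers, dropping duplicates.

    `sources` would be the same hour's file as captured by the main server,
    the DR server, and (after a connection loss) the receivers themselves.
    The layout and names here are made up for illustration.
    """
    seen = set()
    merged = []
    for src in sources:
        for line in Path(src).read_text().splitlines():
            # Many messages are verbatim duplicates: the same transmission
            # heard by several receivers, or present on both servers.
            if line not in seen:
                seen.add(line)
                merged.append(line)
    # Assume each line starts with a receive timestamp, so a plain sort
    # restores chronological order.
    merged.sort()
    Path(out_path).write_text("\n".join(merged) + "\n")

# Example usage (file names invented):
# merge_hour(
#     ["main/2014-06-01T13.log", "dr/2014-06-01T13.log"],
#     "merged/2014-06-01T13.log",
# )
```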
By edge computing, I thought you meant using the browser as a compute resource (e.g. cryptojacking but legit, or SETI@Home type stuff).
I didn't even know running the server side of your app on the CDN was a thing until I saw your RailsConf talk just now. That sounds like it would be really helpful. I guess that means you'd have to replicate your database everywhere too, though, wouldn't it?
Again, public-facing stuff like MarineTraffic might use something like that, but there are still a lot of desktop apps (think MFC or Win32) out there in the maritime industry.
I used to work with shipping container tracking data! It's not the same, but at first glance I thought it might be. Reading your story, though, I do wonder if we know some of the same people. Sounds like an interesting job you had.
Maybe. Did you interact with government folks a lot for that?
Yeah, this was all Army, though. At Ft. Eustis.
Most of the government folks I interacted with were Coast Guard, based either in our West Virginia facility or in DC-ish, but we exchanged some data with the Army Corps of Engineers for inland waterway stuff.
That's pretty cool, what's the origin story behind how you got into all of it?
Also, interesting that you made web apps. Forgive me because I might be being naive or ignorant, but did your users have internet access out in the ocean?
I think government IT contracting is a big part of this area's economy, though I didn't know that at the time.
I came to this area (eastern panhandle, WV) to work with a friend I met in college (at a place that makes software for libraries). After a few years at that company I wanted a change, and through the grapevine heard that someone else I sort of knew from college was looking for a developer at "the Coast Guard" (actually a facility staffed by mostly contractors).
I agonized over my resume for a couple of weeks and eventually sent him a copy. This guy was working there as a developer and technical lead. Over email we talked about what he was looking for. At the time, the main thing they needed was maintenance on a C++ server program, and I had a little experience with C++. I think my background with Java helped, because they had other Java stuff I ended up working on later, but I don't remember if it came up at the time. I went in for a short interview with the manager and another senior dev, mostly just questions about what I had worked with and what I was interested in working with.
The next day the HR department called back with an offer. I slept on it and accepted the next day. Then they said I should wait to give my notice until my interim security clearance went through. That took a week or two. I've heard they often take longer, but I think I may have already been in the system due to a government data entry job I worked one summer during college.
That might have been more detail than you were looking for :)
The sailors on the ocean weren't our users. The Coast Guard runs a network of receivers, and my team's job was to store the data from those receivers and make it available to (mainly) government clients.

There was one web app for Coast Guard "watchstanders" to monitor the status of the receivers, so they could respond and troubleshoot quickly if one had a problem. Our other end users were organizations (such as law enforcement, or VTS, which is like air traffic control for a seaport) that got a live feed of this data by connecting a (Java) client we provided to our (non-web) server, which would feed them a stream of the data. Later, a web app was developed for (shore-side) Coast Guard users that let them query our data for things like vessel status, location, and destination. People could also request an archive of a certain period or area of the data, either through FOIA or through another Coast Guard unit; I think those requests all got e-mailed to our DBA.
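From the client's side, the live feed boiled down to connecting a socket and reading messages as they arrived. A minimal sketch, assuming a plain line-oriented TCP stream; the host, port, and protocol details here are invented, and the real client was Java:

```python
import socket

# Hypothetical address; the real service, its address, and its handshake
# aren't public, so this is purely illustrative.
HOST, PORT = "ais-feed.example.gov", 9009

with socket.create_connection((HOST, PORT)) as sock:
    buf = b""
    while chunk := sock.recv(4096):
        buf += chunk
        # Treat the feed as line-oriented: one message per newline.
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            print(line.decode("ascii", errors="replace"))
```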
I once heard someone who worked on the New York City subway system talk about how they used the number of rotations each wheel made to track certain data, but over time the wheels would wear down at different rates, so the wheel sizes would change. They had to hack around this and make all sorts of workarounds.
Do you have any of these kinds of stories about hacks to make the data tracking work in a real-world scenario?
So, the AIS protocol has messages that are each a string of bits. Sometimes there's a selector number, a few bits long, that tells you what fields make up the next block of however many bits of data. In some cases, that number comes after the bits it describes. Which is fine, though I wish I had learned that before I started writing a parser for a feature I was adding not long after I started there. Fine... as long as the described block is a fixed number of bits.
When we started adding transmit capability (this was several years later), there were some new types of messages we had to generate (actually sub-messages called "Application Specific Messages", but same principle). For two of these message types, the thing that said which type it was came after the block, and the block was a different size depending on which type it was! And something else padded the whole message to the same length either way. Which meant that whatever was parsing these messages could misinterpret one type as the other. So I ended up doing binary math to figure out that the way around this was for some auto-generated field (the sequence number?) to always skip number 48, or something like that.
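Here's a toy illustration of why that's ambiguous; the layouts are made up, not the real AIS bit format:

```python
# Two made-up message layouts, both padded to 15 bits, where the type
# selector comes *after* a variable-length block:
#
#   type A: 10 data bits, selector bit = 0, then 4 bits of padding
#   type B: 14 data bits, selector bit = 1
#
# The selector's position depends on the type, so a parser has to guess.

def parse(bits: str) -> tuple[str, int]:
    if bits[10] == "0":                    # selector where type A would put it
        return "A", int(bits[:10], 2)
    return "B", int(bits[:14], 2)

# A type-B message whose data happens to have a 0 at bit 10 misparses as A:
payload_b = "1011101101" + "0101" + "1"    # 14 data bits + selector = 1
print(parse(payload_b))                    # -> ('A', 749): wrong type and value

# The real fix was similar in spirit: constrain an auto-generated field so
# the bit pattern that triggers the misparse can never occur.
```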
Whoa. What stack was used for the web apps?
Any crazy stories you'd like to share from that experience?!
The main monitoring one that existed before I got there was in C# with ASP.NET. Initially all our databases were MS SQL Server but we switched to Sybase, in this case SQLAnywhere.
There were a couple of apps that might almost count as "serverless", with frontends in jQuery + jQuery UI. They talked to Java services on the backend through an Enterprise Service Bus and a REST API gateway that used Apache Camel, I think. (The ESB itself was run by an infrastructure team, and I don't really think it added much over just having REST services, but higher-ups liked to hear we were using it.)
There was one app that I wrote, which they let me do in Python/Django (though there was a lot of skepticism about Python for applications at that point). It used Matplotlib to make nice graphs and so-so geographic heat maps. I might have mentioned this app before. I had some trouble with the Django support for Sybase databases. [Edit to add: the number of people using Django with SQLAnywhere could probably be counted on one hand with fingers left over. Fewer now that this app has been shut down, or so I hear.]
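The basic pattern there was rendering Matplotlib figures server-side and returning them as images. A minimal sketch of that pattern, with invented data and view name (the real app's models and queries looked nothing like this):

```python
import io

import matplotlib
matplotlib.use("Agg")  # render without a display, as a web server must
import matplotlib.pyplot as plt
import numpy as np
from django.http import HttpResponse

def heatmap_png(request):
    # Stand-in for a query of vessel positions (longitude, latitude pairs).
    rng = np.random.default_rng(0)
    lon = rng.normal(-76.0, 0.5, 5000)
    lat = rng.normal(37.0, 0.3, 5000)

    fig, ax = plt.subplots()
    ax.hist2d(lon, lat, bins=100, cmap="hot")  # 2D histogram as a heat map
    ax.set_xlabel("longitude")
    ax.set_ylabel("latitude")

    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    plt.close(fig)  # free the figure; Matplotlib keeps them alive otherwise
    return HttpResponse(buf.getvalue(), content_type="image/png")
```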
There was one time when we were testing transmit capability (which involved one of the JQuery apps on the front end), and Canada got mad because we were apparently spamming them. I don't know how the complaint came in, but we had to be a lot more careful about transmit tests after that.