
Transatlantic Communications Cable Doubles As Ocean Sensor

2 months ago
alternative_right shares a report from Phys.org: Monitoring changes in water temperature and pressure at the seafloor can improve understanding of ocean circulation, climate, and natural hazards such as tsunamis. In recent years, scientists have begun gathering submarine measurements via an existing infrastructure network that spans millions of kilometers around the planet: the undersea fiber-optic telecommunications cables that provide us with amenities like Internet and phone service. Without interfering with their original purpose, the cables can be used as sensors to measure small variations in the light signals that run through them so that scientists can learn more about the sea.

Meichen Liu and colleagues recently developed a new instrument, consisting of a receiver and a microwave intensity modulator placed at a shore station, that facilitates the approach. Their work is published in Geophysical Research Letters. Transcontinental fiber-optic cables are divided into subsections by repeaters, instruments positioned every 50 to 100 kilometers that boost information-carrying light signals so that they remain strong on the journey to their destination. At each repeater, an instrument called a fiber Bragg grating reflects a small amount of light back to the previous repeater to monitor the integrity of the cable.

By observing and timing these reflections, the new instrument measures the changes in the time it takes for the light to travel between repeaters. These changes convey information about how the surrounding water changes the shape of the cable, and the researchers used that information to infer properties such as daily and weekly water temperature and tide patterns.

Read more of this story at Slashdot.

BeauHD

CodeSOD: Just a Few Updates

2 months ago

Misha has a co-worker who has unusual ideas about how database performance works. This co-worker, Ted, has a vague understanding that a SQL query optimizer will attempt to find the best execution path for a given query. Unfortunately, Ted has just enough knowledge to be dangerous; he believes that the job of a developer is to write SQL queries that will "trick" the optimizer into doing an even better job, somehow.

This means that Ted loves subqueries.

For example, let's say you had a table called tbl_updater, which is used to store pending changes for a batch operation that will later get applied. Each pending change in tbl_updater is identified by a unique change key. For reasons best not looked into too deeply, at some point in the lifecycle of a record in this table, the application needs to null out several key fields based on the change value.

If you or I were writing this, we might do something like this:

update tbl_updater
set id = null, date = null, location = null, type = null, type_id = null
where change = @change

And this is how you know that you and I are fools, because we didn't use a single subquery.

update tbl_updater set id = null where updater in (select updater from tbl_updater where change = @change)
update tbl_updater set date = null where updater in (select updater from tbl_updater where change = @change)
update tbl_updater set location = null where updater in (select updater from tbl_updater where change = @change)
update tbl_updater set type = null where updater in (select updater from tbl_updater where change = @change)
update tbl_updater set date = null where updater in (select updater from tbl_updater where change = @change)
update tbl_updater set type_id = null where updater in (select updater from tbl_updater where change = @change)

So here, Ted uses where updater in (subquery) which is certainly annoying and awkward, given that we know that change is a unique key. Maybe Ted didn't know that? Of course, one of the great powers of relational databases is that they offer data dictionaries so you can review the structure of tables before writing queries, so it's very easy to find out that the key is unique.
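That kind of check takes one query. As a sketch (using the standard information_schema views; whether tbl_updater's environment exposes them, and the exact constraint names, are assumptions), this lists which columns carry a primary-key or unique constraint:

```sql
-- Show unique and primary-key constraints on tbl_updater and the
-- columns they cover. If 'change' appears here, the subqueries and
-- the IN clauses are pure noise.
select tc.constraint_name,
       tc.constraint_type,
       kcu.column_name
from information_schema.table_constraints tc
join information_schema.key_column_usage kcu
  on kcu.constraint_name = tc.constraint_name
 and kcu.table_name      = tc.table_name
where tc.table_name = 'tbl_updater'
  and tc.constraint_type in ('PRIMARY KEY', 'UNIQUE');
```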

But that simple ignorance doesn't explain why Ted broke it out into multiple updates. If insanity is doing the same thing again and again expecting different results, what does it mean when you actually do get different results but also could have just done all this once?

Misha asked Ted why he took this approach. "It's faster," he replied. When Misha showed benchmarks that proved it emphatically wasn't faster, he just shook his head. "It's still faster this way."

Faster than what? Misha wondered.

Remy Porter