A few days ago, a client’s data center "vanished" overnight.
-
A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.
I then suspected a power failure, but the UPS should have sent an alert.
The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.
To make a long story short: the company deals in gold and precious metals, and they have an underground bunker with two-meter-thick walls. They were targeted by a professional gang using a tactic seen in similar hits: identify the main power line, tamper with it at night, and send a massive voltage spike through it.
The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.
That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without relying on any alert being triggered from the inside.
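(For the curious: the core idea behind that external check is trivially simple. What follows is just a rough sketch of the concept in Python, not my actual Uptime Kuma setup; the IP addresses and the webhook URL are placeholders.)

```python
#!/usr/bin/env python3
"""Rough sketch of an external "is the whole site dark?" check.
This is NOT Uptime Kuma: it only illustrates the idea of probing a
site's public IPs from the outside. Addresses and webhook are made up."""

import subprocess
import urllib.request

# Public-facing IPs of the site (ISP routers, firewall, etc.) - placeholders
TARGETS = ["203.0.113.10", "203.0.113.11", "198.51.100.5"]
WEBHOOK = "https://alerts.example.com/notify"  # hypothetical notification endpoint


def is_up(ip: str) -> bool:
    """Send one ICMP echo with a short timeout; True if the host answers."""
    # -W timeout syntax as on Linux's ping; adjust the flag on BSD/macOS.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0


def main() -> None:
    down = [ip for ip in TARGETS if not is_up(ip)]
    if len(down) == len(TARGETS):
        # Every public IP silent at once is never "just a flaky link".
        message = f"SITE DARK: no response from any of {', '.join(TARGETS)}"
        urllib.request.urlopen(WEBHOOK, data=message.encode())


if __name__ == "__main__":
    main()
```

Run something like this from a cron job on a box that lives outside the client's network and it catches exactly the "total silence" case that internal systems, by definition, cannot report.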
The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.
The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.
Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.
Never rely only on internal monitoring. Never.
@stefano Only in BSDcafé can you read actual techno thrillers like this.
-
@EnigmaRotor Sometimes the lights are low and the atmosphere is dark...
-
@stefano Stefano Jones P.A., a very noir series.
-
@EnigmaRotor /me making coffee in the dark, while whispering some IT horror stories
-
@stefano Oh, if the genre is horror, then don’t forget to tell the tale of the guy who said “Microsoft” three times in front of his mirror. What happened next, the blue mirror of death, is frightening to the bone.
-
@stefano feeling of :xkcd:`705` intensifies :D
-
In the first sentence you mention a "data center", but such an attack would not work on a data center: to qualify as one, you need two buildings with independent power supplies, at a safe distance from each other, etc. I think this was at best a hosting room, not a data center.
-
@uriel Sure - we tend to call the specific place inside the company that hosts the servers (with A/C, etc.) a "data center". Maybe a little inappropriate here.
-
@stefano I must repeat this: never trust on-site backups either. Fire will destroy those. And RAID is not backup.
You know this, but it bears repeating!
-
Well, not "a little". The one you described is - at best - a server room, not even a hosting center, since, according to the blueprints, there was no redundancy...
-
@uriel You're right. I've updated the original post to clarify it. Thank you for pointing it out!