My come to religion moment with backup happened in the late 1990s. I was working in my second IT-related job, as the webmaster and a jack of all tech trades, for a small San Francisco Bay-area start-up. Our one web server, the only source of all of our data, had come up lame, and I needed to fly out immediately and race down to a co-location facility in San Jose to fix the problem. Long story short, a co-worker and I spent the entire night there troubleshooting the server. We slept, awkwardly, in our boss's VW Jetta when we finally realized at some awful hour of the night that we needed more parts. A trip to Fry's Electronics and about $2000 later, we were able to fix the problem and do so, delicately, without losing any data. And as part of our fix, we finally did what we should have done in the first place: we bought a tape backup. Hey, it was the 1990s.

But that is only part of the story. The final irony here is that the tape backup we bought never actually worked. A few years later, when we dismantled that web server, a $5500 Dell with a Pentium Pro 166 processor, I decided to use the tape backup elsewhere. In doing so, I discovered that the tapes we had backed up to were unreadable. Restore is the other side of the backup coin, of course, and in discovering this horrible reality, I found restore religion too.

Those experiences have guided my subsequent interactions with data backup across a wide range of local and cloud solutions over time. And while I have strayed from time to time, I've always believed in, and preached, the importance of maintaining a solid backup strategy. Naturally, this strategy evolved over time with the technology. Like many of you, I spent many years trying to duplicate the Microsoft corporate infrastructure we're familiar with at home. For those working in IT, this has always been a good way to keep up with the technologies one was using at work or soon would be. For me specifically, it was a way to stay familiar with Microsoft technologies at a time when I was no longer implementing them professionally. Doing so was pragmatic in a number of ways: it was a way to keep my skills up to date so that I could write about the technology more effectively.

Google Cloud Storage Nearline is a low-cost storage service for data archiving, online backup, and disaster recovery. Typical applications are cold storage of infrequently accessed data and disaster recovery. Nearline Storage is a better choice than Standard Storage in scenarios where slightly lower availability and slightly higher latency (typically just a few seconds) are an acceptable trade-off for lower storage costs. Google claims response times of approximately three seconds and pricing of 1 cent per GB per month, and Cloud Storage Nearline provides a 99% availability SLA (service level agreement) and a throughput guarantee of 4 MB/s per TB.

Following the launch of Google Cloud Storage Nearline in March 2015, CloudBerry Lab announced support for this new category of cloud storage service through its CloudBerry Backup family of solutions. For SMBs, this collaboration gives CloudBerry customers access from day one to a low-cost target that's well suited for backup storage. And for CloudBerry's MSP partners, Cloud Storage Nearline is integrated with all other Google Cloud Platform services and uses the same unified API.

Said Alexey Serkov, CTO, CloudBerry: "It's very low-cost and yet provides access to data in seconds, not minutes or hours, so when customers need to restore, they don't need to worry about excessive downtime while they wait to retrieve the backups."
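Those pricing and throughput figures lend themselves to a quick back-of-the-envelope check. A minimal sketch, assuming the stated launch numbers (1 cent per GB per month, 4 MB/s per TB); the 500 GB workload and helper names are made up for illustration, and "4 MB/s per TB" is read as aggregate throughput that scales with the amount of data stored:

```python
def nearline_monthly_cost_usd(gb_stored: float, price_per_gb_usd: float = 0.01) -> float:
    """Monthly storage cost at Nearline's launch price of one cent per GB."""
    return gb_stored * price_per_gb_usd

def full_restore_hours(tb_stored: float, mb_s_per_tb: float = 4.0) -> float:
    """Hours to stream back an entire archive at the guaranteed throughput floor.

    Because the guarantee scales with data volume (4 MB/s for each TB stored),
    a full restore takes the same wall-clock time at any size: 1 TB at 4 MB/s,
    10 TB at 40 MB/s, and so on.
    """
    total_mb = tb_stored * 1_000_000            # decimal units: 1 TB = 1,000,000 MB
    throughput_mb_s = tb_stored * mb_s_per_tb   # guaranteed floor for this volume
    return total_mb / throughput_mb_s / 3600

print(nearline_monthly_cost_usd(500))     # 500 GB -> 5.0 dollars per month
print(round(full_restore_hours(0.5), 1))  # ~69.4 hours at the guaranteed floor
```

At the guaranteed floor, then, Nearline is cheap to keep but slow to drain in bulk, which is exactly the cold-storage trade-off described above.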