The State of My Private Cloud in 2019

I have been maintaining my private cloud network, powered by Resilio Sync, for a few years now. I have talked about it before; see this search for all those posts. When I built the original version of my private cloud, the intention was for it to provide a 3-2-1 backup solution for my stuff. The effort involved in maintaining the system turned out to be more than I would like. Still, even with more work than I expected, it has largely been a success for me.

At the time I built the network, my intention was to use Raspberry Pis as my remote nodes. As my use of the system evolved, that stopped being viable. One of my first Raspberry Pi remote nodes had to be replaced because the drive I deployed just wasn't big enough; that wasn't a Pi-specific issue. The next thing that happened was that I ran into significant challenges with the amount of memory available on a Pi 2. Resilio would crash the Raspberry Pi: the app would consume all of the available memory until the OS froze. I had the same problem on my Synology DiskStation at one point, but that was fixable with a $15 4 GB memory upgrade. I could not do anything like that with the Raspberry Pi 2.
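
I never found a clean fix on the Pi 2, but a watchdog along these lines could restart Sync before the box locks up. This is only a sketch: it assumes psutil is installed, and that the binary and service carry the standard Linux names (rslsync and resilio-sync); adjust for your install.

    #!/usr/bin/env python3
    """Watchdog sketch: restart Resilio Sync before it exhausts a small Pi's RAM."""
    import subprocess
    import psutil

    LIMIT_MB = 700  # leave headroom on a 1 GB Pi 2

    def sync_memory_mb():
        """Total resident memory of all rslsync processes, in MB."""
        total = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            if proc.info["name"] == "rslsync":
                total += proc.info["memory_info"].rss
        return total / (1024 * 1024)

    if __name__ == "__main__":
        used = sync_memory_mb()
        if used > LIMIT_MB:
            # Restarting the service beats letting the whole OS freeze.
            subprocess.run(["sudo", "systemctl", "restart", "resilio-sync"], check=True)
            print(f"rslsync was using {used:.0f} MB; restarted it")

Run from cron every few minutes, something like this would have kept the Pi 2 nodes limping along instead of freezing solid.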

To work around the limitations of the Raspberry Pi 2, I bought more powerful and thus more expensive computers. The two remote machines I had running were fanless Zotac ZBOXes. They were great; the only downside was the cost, which was significantly more than a Pi. I bought a low-end Celeron version of the Zotac for around $150 plus memory and drives. That was about four times as much as a similar Pi 2 setup, but at the time I had no good alternatives.

Then someone at work put me onto the Hardkernel ODROID-HC1, which was designed as a personal-cloud type machine. It came with a case that holds an internal hard drive. The beauty of these machines was that they had 2 GB of memory and were not much more expensive than a Pi 2 at around $50. I think I spent maybe $70 including the memory card and other accessories, not counting the hard drive. Since the drive was an internal one, it was cheaper to buy than the external drives I used with the Pis.

I purchased two ODROIDs within a year. One was at a friend's house; the other was replicating data at home. I had problems on both machines with what I think was corruption of the OS on the SD card. The remote host had to be rebuilt twice. By the third time it had a problem, I gave up; I just didn't want to spend the time troubleshooting it, and I'm still not sure why they kept getting corrupted. I still have one of them at home, and it has been pretty stable this year. I gave the remote one to the friend who hosted it for me; he was going to see if he could use it for something. The ODROID was a good idea, but it did not turn into a long-term solution for me.

When I first started this private cloud project, the public consumer file storage services did not really offer zero-knowledge encryption. The only service at the time that was financially viable for me was MEGA. I tried it out, it wasn't seamless for me, so I abandoned the public cloud option and went with my private cloud. Today there are a few service providers that cater to people looking for zero-knowledge encryption for remote storage. There still aren't many of them, but I was glad to see the landscape has evolved since I started this project.

I'm not sure what triggered my research into public clouds again, but over last summer I started looking at the cost benefit of going with a zero-knowledge-encryption public cloud provider instead of continuing to build my own network. I found a provider I liked, Tresorit. They ticked all the boxes for what I was looking for. The challenge was the cost: over £20 a month for 2 TB. Their only cheaper plan did not have enough space for my needs.

When calculating the lifecycle of the hardware I buy for my own private cloud network versus the service costs of the provider, it's probably cheaper to keep doing it myself. Originally that was not true. Between when I started investigating a move to a service provider and today, the available kit changed: the Raspberry Pi 4 came out. Needing to replace the ODROID and possibly one Zotac at minimum in the next 3 years would have cost several hundred pounds. The 4 GB Pi 4 clocks in at around £60 for the computer and all the accessories I need, minus a hard drive; I am recycling a hard drive, so there is no additional cost there. When the Pi 4 was announced I immediately put in an order for one of the 4 GB models. My hope was that it would perform well enough to use in my private cloud network. On paper it solves the memory usage issue of the Pi 2 and 3.

At the time of writing I have had my first Pi 4 running in “production” for almost 3 months. The software has been pretty stable. I am running it within a Docker container on the Pi 4. So far the system is consuming well under 50% of memory, usually somewhere between 1 and 1.5 GB. One of the other clean-up things I did was consolidate my many shares down to 5 total; the Pi replicates 4 of them.
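
For reference, this is roughly how that container runs, sketched here with the Docker Python SDK rather than the docker command line. The resilio/sync image, its /mnt/sync data volume, and ports 8888 (web UI) and 55555 (sync traffic) follow Resilio's published Docker instructions; the host path is a placeholder.

    import docker  # the docker-py SDK

    client = docker.from_env()
    client.containers.run(
        "resilio/sync",                         # official Resilio Sync image
        name="sync",
        detach=True,
        restart_policy={"Name": "on-failure"},  # come back up if it crashes
        ports={
            "8888/tcp": ("127.0.0.1", 8888),    # web UI, bound to localhost only
            "55555/tcp": 55555,                 # sync traffic
        },
        volumes={
            "/home/pi/sync": {"bind": "/mnt/sync", "mode": "rw"},  # placeholder host path
        },
    )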

With the extra space I have on a remote node, I can also take local copies of the replicated data on that remote machine. That should complete my 3-2-1 backup strategy. Since I want extra resiliency in my plan, I will continue to take annual point-in-time offline copies of most of my data.
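
A minimal sketch of what those local copies could look like on the remote node: dated rsync snapshots that hard-link unchanged files, so each copy stays cheap on the 5 TB drive. The paths are placeholders for wherever the replicated shares actually live.

    #!/usr/bin/env python3
    """Take a dated local snapshot of the replicated shares (paths are placeholders)."""
    import datetime
    import pathlib
    import subprocess

    SOURCE = pathlib.Path("/home/pi/sync")          # the replicated shares
    SNAPSHOTS = pathlib.Path("/home/pi/snapshots")  # local point-in-time copies

    SNAPSHOTS.mkdir(parents=True, exist_ok=True)
    latest = SNAPSHOTS / "latest"
    target = SNAPSHOTS / datetime.date.today().isoformat()

    # --link-dest hard-links files unchanged since the last snapshot,
    # so each dated copy only costs the space of what actually changed.
    subprocess.run(
        ["rsync", "-a", f"--link-dest={latest}", f"{SOURCE}/", str(target)],
        check=True,
    )

    # Repoint 'latest' at the newest snapshot for the next run.
    if latest.is_symlink():
        latest.unlink()
    latest.symlink_to(target)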

Since I am reusing hard drives right now (I over-bought on size in the last upgrade, and the drives are still great), I can get another Pi 4 for £60 and have a refreshed pair of remote nodes. I continue to use my Synology, my laptop, and a Linux server for the other nodes at home.

My costs this year are on target to be £60-£120. That is half the price of one year of the cloud storage service. The new machines should give me 2 to 3 years of service easily, especially since I'm deploying them with 5 TB drives and I'm only using about 1.3 TB for what I'm backing up today.
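
The back-of-the-envelope maths, using the figures above:

    # Rough break-even maths, using the figures from this post.
    cloud_per_year = 20 * 12        # Tresorit 2 TB plan at just over £20/month
    this_year_hardware = 2 * 60     # two Pi 4 kits at ~£60 each, drives reused
    print(this_year_hardware / cloud_per_year)  # 0.5 -> half of one year of cloud
    # Amortised over the 2-3 years the Pis should last, the gap only widens.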

I am pleased that the build-it-myself system is cheaper and continuing to work out versus the public cloud option. As long as maintaining the system is not a lot of trouble, I picked the right option.

A Test of The HorcruxNet

When I explained my Resilio personal cloud setup to someone at work, they replied that I have my own personal Horcrux, minus the killing. I liked the idea, so I am naming my Resilio backup setup HorcruxNet.

The network is having its biggest test this week in its roughly 3 years of existence. I am moving. Movers are packing up our stuff tomorrow, which means I am putting my electronics into “Cleaning Lady Safe Mode”. That is what I used to have to do when our cleaning lady showed up: I would unplug everything so she didn't mess stuff up. She did a few times.

I have expanded my network to have replica or partial replica copies on my laptop. I also have 3 working remote sites, thanks to friends and family hosting nodes. While we move, my two primary full nodes (my Synology and Mac mini) will be offline for about 4-5 weeks. During that time my remote hosts will hopefully keep humming along. With my home network offline I doubt there will be many changes, but since my laptop has a partial replica, any changes I do make will propagate out.

I love a nice, well-configured computer system, if I do say so myself.

BitTorrent Sync Network Phase II

Since around the new year I have been trying to figure out what the next phase of my private cloud backup network will look like. The design originally leveraged several Raspberry Pis, but in practice only one remains at a remote location; for the other remote locations I'm using old Mac minis. Even the one Pi I do have deployed is inoperable and needs to be rebuilt. I'm not sure if it's the Raspbian build of BitTorrent Sync or something else, but I have had several problems with the Raspbian installs losing their license identity. When that happens I have to re-add the BitTorrent Sync Pro license and reindex all my shares on that node. It's annoying, and I'm concerned it will become a bigger problem at some point.

At the same time, I've been looking at ways to better secure the remote data. All of my systems are at friends' and family's houses, so endpoint security has been less of a concern than network security. Still, when BitTorrent Sync announced encrypted folders, I was extremely curious. After playing around with them for a while I have opted not to use encrypted folders, but it's something I'm still thinking about for the future.

On a side note, I've been contemplating a Linux desktop to complement my Mac. I shopped around and found a nice inexpensive Zotac. I picked one up and put a 120 GB SSD and 8 GB of RAM in it. I wasn't sure if I would use it as a desktop or to replace my BitTorrent Sync Pi. Right now I'm having it keep a replica copy of my data at home to test it out. I'm currently running Ubuntu 14.04. It's been running pretty well, but I did have one or two anomalies with sync folders, so I am not yet ready to deploy it in the wild. My long-term goal is to replace all the minis with something like the Zotac. The new Ubuntu machine also increases that node to 4 TB of storage, versus the Raspberry Pi I have deployed with a 1 TB hard drive.

As part of my incremental upgrades I have put a 4 TB drive in one of the remote minis, and the other has a 2 TB drive. That gives me some headroom, since a fully seeded backup is around a terabyte.

I'm holding off on any additional drive upgrades until I can confirm that the Ubuntu-based Zotac is working well. If it is, I hope to pick up another one or two.

The Raspberry Pis are not going to go to waste. I'm using one of them as an extra replica copy at home for BitTorrent Sync; I have a higher tolerance for failure there since it's an extra copy of my data. I'm installing some applications on another one. Future projects for the remaining ones include a possible reverse proxy, Wi-Fi hotspot, and/or webcam. I just need to find the time for all these projects. For now I just want to finish my backup solution upgrade.

And if you're reading all this thinking to yourself, “Do I really need a four-node private cloud network?”, the answer is of course not. The other answer is that it's really six nodes if you count the ones I'm running at home. In the end I didn't save money using a private cloud: even though a lot of the equipment was lying around, there were some upfront costs that I won't recoup unless I use the system for 2 to 3 years. The reason to do it was more because I can, and because I wanted control over my information. I'm glad I'm continuing to tinker with this, since I'm learning a lot and having a lot of fun.

Pi Net Upgraded

I have been challenged with figuring out an easy way to upgrade the BitTorrent Sync software I use to replicate my files on my Raspberry Pis. The upgrade process on my Mac is super easy. The upgrade process on Raspbian should be easy too, except the person maintaining the easy-to-upgrade package hasn't updated it in 9 months.

The internet is a small world. Every single how-to I found about installing BitTorrent Sync on a Pi (at least half a dozen) points to this one repo. I found instructions for setting everything up manually, but it was not a simple process. Right as I was reviewing the steps for the manual install, I stumbled across someone who forked the original package and is maintaining the latest update of the sync software.

After testing the install process on a clean Pi, I ran an upgrade on one of the two Pis I currently have at home. The upgrade worked great. I then ran the upgrade on the other local Pi and on the remote units with no major issues. This new repo says it stays in sync with BitTorrent Sync's release schedule within about 24 hours. Hopefully I will be able to stay up to date from now on.
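
Now that every node pulls from the same repo, future upgrades could be scripted across the whole fleet. A rough sketch, with placeholder hostnames, assuming the forked repo publishes the package as btsync and that I have working SSH access to each node:

    #!/usr/bin/env python3
    """Run the BitTorrent Sync package upgrade on every node over SSH."""
    import subprocess

    NODES = ["pi-home-1", "pi-home-2", "pi-remote-1"]  # placeholder hostnames
    UPGRADE = "sudo apt-get update && sudo apt-get install --only-upgrade -y btsync"

    for node in NODES:
        print(f"--- upgrading {node} ---")
        subprocess.run(["ssh", node, UPGRADE], check=True)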

I am hoping my upgrade (from 2.0.94 to 2.2.7) solves some minor bugs I have been experiencing with one location.

This was the first major upgrade of all the nodes on my private cloud. I was a bit cautious, but in the end the system didn't get corrupted and functioned as it should.

Next up I am thinking about another remote node location.

Pi Net is Live

At 19:28 local time today, my Pi Net node Epsilon came online at my friend's house and started syncing with the rest of the network. This is the first remote node in my private cloud network. The node was built using a Raspberry Pi 2 and a 1 TB USB hard drive that I had lying around. For my data replication I am using BitTorrent Sync 2.x. In order to get the Pi working I had to learn a bit of Linux. This is the first major milestone in my project to ensure my personal data is backed up offsite from my apartment using a secured private cloud, without leveraging any potentially insecure public clouds.

Next up in building out this network will be a second off-site location. I need to finish setting up the second Pi and bring that node live for the network to be complete.

My Personal Private vs Public Cloud Debate

I have been pondering my 3-2-1 backup strategy for several months now. Even before I had a near-catastrophic issue with my Synology DiskStation back in April, I knew I wanted a more robust data management plan for my personal files. I had been using the Synology Cloud Station software, but in my original configuration I was limited, since all my data centered around my apartment. That is convenient but not the safest approach. I also only really had 2 copies of most of my data. A really rock-solid strategy has 3 copies across 2 sites and, if possible, different media types. I knew I wasn't doing well enough.

One solution I toyed around with was BitTorrent Sync. Back in March I tried it out and had big problems: the UI on my desktop and the web interface on my DiskStation would regularly fail to load. I didn't feel the solution was ready for me to use yet. I also wasn't confident in the Cloud Station software anymore, since I ran into a data integrity issue with it in March. I had uploaded photos I took that month and noticed they hadn't reached my DiskStation, even though the client said they were syncing off my desktop. To make things more confusing, the files were syncing to my laptop but not the DiskStation.

That problem led me to the BitTorrent Sync option and the decision to upgrade to a beta version of the Synology software. That in turn led to some problems and my near-fatal event with the DiskStation. So basically, in a mission to solve my backup strategy problem, I caused an event that a better backup strategy would have protected me from.

My near loss of all the data on the DiskStation was my wake-up call that I needed to figure out a working solution. After I (barely) recovered from that incident, I focused large amounts of time on solving my data strategy.

After my trust was shaken by the Cloud Station data integrity problems, I began to look at public cloud solutions. I am very concerned about security, so I discounted many public cloud providers. Dropbox, and pretty much any American-based solution, is just not trustworthy with today's laws. Dropbox employees can even get into your files if they need to, regardless of the safeguards the company claims to have, and I do not want anyone being able to get into my stuff unless I let them in. That security concern led me to MEGA. They weren't US-based, and they don't have a way into your data. Their plans were more expensive than most cloud providers', but their 2 TB plan more than covered my needs.

One downside was that MEGA didn't have any built-in backup solution for the DiskStation. That meant I could back up my data to the cloud, but the DiskStation couldn't be my main data source; I could still use it as a home server, just not the home of my data. I wasn't really pleased about that, but I couldn't find an alternative that worked. So earlier this spring I jumped into using MEGA as my cloud sync / backup solution. It had a lot of what I wanted, but it wasn't perfect; it was the best of what was available that worked for me.

I spent a few weeks seeding and confirming my data was correct in MEGA before trying to make a sync copy onto my laptop. That was when I ran into issues. What I noticed was that after syncing files for a while, the MEGA agent would freeze, sometimes after only 70-100 files downloaded. I would restart the agent and it would do the same thing: copy a few more files and then stop. I couldn't really figure it out. I tried reinstalling, putting the agent on another machine, and checking their online help, but I was unsuccessful at finding a solution. This was on top of a general annoyance where it would take 5-10 minutes for my admin page on the MEGA site to load. I don't know what it was doing, but in any browser (Chrome, Firefox, or Safari) I had the same issue. That wasn't a show-stopper on its own, but combined with not being able to download a replica of my data onto my laptop, I was very concerned.

After a week of tinkering I gave up and had no choice but to revert my decision and go back to the Cloud Station software by Synology. I didn't want to bother calling MEGA. I had a confidence issue: even if they could fix the issue in 10 minutes, with my level of technical knowledge and their online tools, I should have been able to figure it out myself. If I can't, I have concerns about their service. It wasn't like my problem was complex; the system just stopped syncing data after a few files.

I wasn't pleased, but MEGA was never the perfect solution for me. I knew that going into it but thought I could make it work. In the end I couldn't, so I went back to the DiskStation as my primary data store and used Cloud Station to sync. I kept an eye on the data I synced to make sure I didn't have a repeat issue. My plan to build another DiskStation and leave it with a friend was back on the table. That was, until about a week ago.

I don't know what got me to look at BitTorrent Sync again, but I installed it again. I knew they were actively releasing newer versions, so I hoped that what happened last time was an issue that had been solved in the intervening months. I was pleasantly surprised to see, after some brief testing, that the UI display issues appeared to be solved. Over a few days I slowly turned off syncing via Cloud Station and enabled BitTorrent Sync, making backups before my changes just in case. The UI consistently worked on the desktop. For the web interface on the DiskStation, what I learned was, first, that Safari wasn't so great for it, and second, that clearing my cookies for the site typically solved the issue. With that resolved, I moved most of my shares over to the BitTorrent Sync app within a week. I was originally going to try out the system for 30 days and then decide if I wanted to pay for the Pro version, but after going over the 10-share limit of the trial I opted to pay $39 for 1 year of Pro.

As of right now I have my desktop, DiskStation NAS, and laptop all replicating data with the new Sync software. I am at a steady state like where I was back in March. This steady state took a lot of research and trial and error, but now comes the harder part: I need to finish this project and meet my objective of a true private cloud with data located in multiple locations.

Using BitTorrent Sync gives me a few options I didn't have with the Synology-centric Cloud Station. My remote backup locations do not need to have a DiskStation. I have two old Mac minis that I could provision with external hard drives and drop in remote locations. That was my original idea, at least. Now I am thinking bolder and cheaper. If I am successful, I will have a prototype to talk about soon.

Raspberry Pi Cloud Node Prototype

The first phase of my BitTorrent Sync project is mostly complete. I now have the Sync software running on my DiskStation, Mac mini desktop, and MacBook Pro laptop. I have replaced my Synology Cloud Station app, and all files are kept up to date using BitTorrent Sync. That change gets me back to where I was before with Cloud Station: I have access to all my files everywhere, but I do not have a complete backup solution, since most of the data is only current in two places. Those places are 15 feet from each other, which doesn't give me much disaster recovery.

The next phase of my backup / cloud strategy is to have offsite copies of my data, or large parts of it, in case anything happens at home. I originally planned to do this with a second Synology DiskStation somewhere, but the costs were very high and that solution limited me to one other location. When I was thinking of using the Cloud Station software from Synology, that solution made sense; now that I have BitTorrent Sync working, I have other options.
[Photo: Raspberry Pi & drive]

Is BitTorrent Sync my answer to my sync situation?