Yes, Your Internet Provider Can and Might Be Spying on You

In late March, Congress repealed FCC regulations that prevented Internet service providers from collecting and selling information about their customers without their consent. Rightfully, many people are upset over this. Security blogger Brian Krebs points out that this repeal changes nothing day to day, because the rules that were repealed had not yet taken effect. I would go a step further and say that if someone is only now concerned about this issue, they likely won’t take the right steps to protect themselves anyway.

I applaud people’s concerns. They should be concerned. That being said, several people have recently asked me questions about VPN setups. A VPN might solve the issue of your ISP collecting data about you; however, it does not stop all the other companies that are collecting data about you.

When I talk about this topic with anyone, I always recommend that they watch the documentary Terms and Conditions May Apply. I’m not sure how many of my friends have actually seen it. It’s a disturbingly fascinating view of how your information is being collected. Thanks to my friend Andrew, who pointed this documentary out to me last year.

I just finished reading The Art of Invisibility by Kevin Mitnick. I previously read his book The Art of Deception and liked it a lot. In The Art of Invisibility, Kevin goes over the details of what you would need to do to become invisible online. In the end, there’s no way I’m going to take all the steps necessary to do that. It was disturbing just to read the extent of what you would have to do to become truly invisible. For my part, I outlined in a previous post some of the steps I take to minimize my exposure.

When people ask me what VPN provider to get, or about some other way to secure themselves online, the question I usually ask is: what is their threat model? What’s the specific problem they’re trying to solve? I have a few threat models for my online behavior, depending on the situation. I know that I am light years ahead of what most people do; however, I’m also aware there are several key improvements I need to make in how I use the Internet.

I use a VPN; however, I don’t use it as often as I would like to. When out of my apartment I try to use it all the time, unless I’m at work on my work equipment. At home I have set up my router to tunnel everything through the VPN. The challenge is, I don’t use it. I have a consumer router running open source firmware, and it suffers from the same problem all other consumer routers do: a relatively lightweight CPU. When I run a VPN client from one of my computers, I get near the line speed I would get without the VPN. When I run the VPN on the router, my connection is 4-8 times slower. This is all due to CPU constraints on the router.

To solve this problem I need to either buy a commercial-grade router or build my own using a computer. I’m going to opt to build my own router using a low-end Zotac fanless computer. One of the guys at work suggested pfSense. It looks pretty good and I’m going to give it a try. Now I just need to find the time to work on it.

My recommendation to my friends is: yes, get a VPN, preferably one incorporated outside of the US. I personally have been using NordVPN for over a year and have been pretty happy with it. I have recently been trying out AirVPN. They have fewer entry points in the US; however, they offer some unique features with their VPN client. I also like the history of the organization and why they became a VPN provider.

I also recommend, if you’re serious about your privacy, reading one of the books I suggested or just watching the movie. Most people understand that the stuff they do online is being tracked; however, I don’t feel that most of my friends or the general public truly understand the extent to which you are being tracked.

The Ever Increasing Complexity of Securing My Personal Information Online

Do you know how many online accounts you’ve created? How many of those have personal information that could be exploited or sold? According to LastPass, I have approximately 350 online account profiles. I’m not sure exactly how many of those hold personally identifying information such as my name, address, email, or even a credit card. I am guessing maybe 1/3 to 1/2 of those sites require a physical address and perhaps a credit card or some other payment information. In this day and age, when Yahoo has had at least two compromises of their security, I personally cannot trust random institutions on the Internet to keep my information safe.

I’ve had this conversation with many people, and depending on the audience I am considered either a tinfoil-hat crazy or just a determined realist. Either way, the state of the Internet today, where many sites require registration, is such that I am concerned about the level of personal information I’m trusting to people that frankly don’t deserve, or have not earned, that trust.

There is no one simple fix to this challenge. I have taken a multilevel approach to addressing the situation, depending on my use of a particular website.

For websites requiring a name and email address, I simply provide an alternative name as well as either a unique email address I can destroy as needed or a generic email address that I periodically destroy. If I create a unique address, I can simply destroy it when I no longer need that website. The process of creating an address does take a few minutes, however, so for one-off sites that I need to register with and don’t intend to use again, I use generic addresses that I delete every few months. That helps me reduce spam.
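The per-site alias scheme above can be sketched in a few lines. This is a hypothetical example: `example.com` stands in for whatever catch-all domain your mail provider supports, and the site name in the address makes it obvious who leaked it if spam starts arriving.

```python
import secrets

# Assumption: you control a catch-all domain, so any local part is deliverable.
CATCH_ALL_DOMAIN = "example.com"

def site_alias(site: str) -> str:
    """Generate a unique, disposable address for a single site.

    The random token makes the address unguessable; embedding the site
    name lets you trace a leak back to whoever sold or lost the address.
    """
    token = secrets.token_hex(4)  # 8 hex characters of randomness
    return f"{site}-{token}@{CATCH_ALL_DOMAIN}"

# Example: a one-off registration for a shopping site.
alias = site_alias("shopsite")
```

When the site outlives its usefulness, deleting the alias at the mail provider kills the relationship cleanly.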

The above solution only works for websites that do not require payments of any kind. Things get complicated when you start dealing with websites requiring some sort of payment method. To limit exposure I’ve used a few different options depending on the situation.

The simplest case is a service provider for a website with digital goods that accepts Bitcoin. In this scenario I would not need to provide any personal information or any real payment information. The challenge here is that the number of websites offering Bitcoin payment options is limited. One example of this use case would be my VPN provider.

The next case is a website that does not offer Bitcoin but where I still need to pay for services that do not require shipping anything to me. Here I look to use PayPal when possible, since none of my personal information is stored with the website, only on PayPal’s systems. If that’s not possible, I will use a real credit card. For recurring purchases, as of now I’m stuck and need to continue to provide my real information and a credit card. For nonrecurring services I will use Blur. Blur is a service that allows me to buy prepaid credit cards. What is unique about this service is that it lets me use their address and any name I want on the virtual card. It’s also completely virtual, so I can use it as a one-off disposable credit card number. I’m trying to go back to websites that require credit card details but that I no longer shop with, or shop with only rarely, and replace any valid credit cards with one of the disposable ones from Blur. It requires a lot of effort; however, I update a site or two here and there when I have a few minutes to spare.

One of the challenges with Blur is that in some cases I have had issues validating the credit card. It’s hit or miss, so I’d like it to be more reliable; however, it’s still a good choice when I no longer want my personal details shared but the account on the site cannot be canceled. At that point, filling in details not tied to my personal information is useful.

The most complicated scenario is when I need a real physical address to have something shipped to me. In those scenarios, Apple Pay or PayPal is preferred; that way my details are not stored on any website’s systems. In recent months I’ve been surprised how many services do offer PayPal; however, the majority of the time it feels like I do need to provide my credit card information. In cases where I have to give my real credit card details, I try not to create an account on the site. Many websites force you to, though. In those cases I try to remember to go back afterwards and put non-identifiable information in my profile. That way, if the site is hacked, all they have is my purchasing history and information that cannot be tied to me. I’m not as consistent in doing that as I would like to be; however, over the past year I’ve been more diligent about cleaning up who has my personally identifiable information.

Within that last group of sites there are some that I frequently reuse. Under those circumstances I don’t have a choice at this time other than to maintain my personal information, including credit card details, with that website. In the case of someone like Amazon, I use two-factor authentication; however, that does not prevent them from being hacked and their database stolen. At present, maintaining information on these sites is a risk I have to take if I want to use the Internet. Everything I described previously lets me minimize the number of sites I have to trust with this information.
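As an aside, the two-factor codes an authenticator app shows are not magic. For the common time-based variety, here is a minimal sketch of the RFC 6238 TOTP derivation using only the standard library; the base32 secret is whatever the site hands you when you enroll.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at: int = None, digits: int = 6, step: int = 30) -> str:
    """Derive a time-based one-time password (RFC 6238, HMAC-SHA1).

    The current time is divided into 30-second steps; the step counter
    is HMAC'd with the shared secret, then dynamically truncated to a
    short decimal code.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at if at is not None else time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Both the site and your phone compute the same code independently, which is why a stolen password alone isn't enough, though, as noted above, it does nothing for a stolen database.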

Even with all of these actions I’m not where I want to be with regard to personal information exposure online. I’m probably better off than 99% of the population; however, I know what specific actions I need to take to secure myself further. Now it’s just a matter of finding the time to go through the list of sites I’ve recorded that I’m registered with and make the necessary updates. At the time of writing I’m about 60 to 70% done. The challenge is that it only takes one site, as with the Yahoo breach, for bad things to happen.

Securing Email Isn’t Only For Spies, Dissidents, & Journalists, Right?

Over the past year and a half I have been taking lots of steps to secure my digital life, and I’ve written a lot about the different aspects of that, including my migration from Google Mail and other services to more secure options.

One concern I’ve known about but not yet addressed is the quantity of my data that remains online. For example, even though I moved my mail to a Swiss-based provider, I still had my entire email archive available there. I have mail going back as far as 1997, I believe. I have been wanting to take that archive offline and off my email provider’s servers. Over the years I’ve had a packrat mentality where I want to keep all of my messages. Recently I’ve grown to not want many of the messages I receive. I’ve been deleting stuff that is unnecessary; however, there are still things that I get and do want to keep. In general I would like to keep the archive, especially my personal correspondence.

The challenge is that I’m growing less trusting of any service provider. Even though my email hosting company is in Switzerland, they take no extraordinary security precautions, so the system is just as susceptible to hacking as most. That means my mail at rest is in the clear, unencrypted. What I want to do is take my mail and store it offline so I have more control over it. I currently plan on keeping it in a local archive on my Mac at home. I will also have it backed up on my BitTorrent Sync network.

The first step in this process was to copy all of my mail to a local application. For my purposes I found the built-in Mac Mail application to work the best. Once I had a downloaded copy of all the mail, I was able to export it to an mbox-formatted archive. At the same time I took the opportunity to recategorize how I organize my mail. In the past, when I was using Google, I had been using tags extensively. When I exported out of Google I went back to a folder structure where each high-level tag was its own folder for received mail. When I exported the mail to a local folder, I put all sent mail in one folder and all received mail in another. Using mail tags I was able to continue to tag and make smart queries of the mail if I ever needed the categories I used in the past.

Once I had the offline mbox files, I put them in an archive on my BitTorrent Sync network. I kept the live copy in Mac Mail on my computer in case I need to search for an email in the archive. In the few weeks since I’ve done this, I’m surprised how often I do go back and reference old emails for things like key codes or when I bought something. After I was satisfied that the mail was backed up, I deleted it from my hosting provider. I did leave this calendar year’s mail with my hosting provider; I figured that was a good round number to keep online, and I can do an archive annually. Having to be at home, or to remote into my home computer, to perform mail queries has become a slight inconvenience; however, it hasn’t been the end of the world.

In addition to moving my entire mail archive offline, I want to go further and start using a secured email provider like ProtonMail that takes extraordinary steps to encrypt the data at rest. I do not need that level of security for all my mail; however, it does come in handy for some of it. There have been several messages I’ve been hesitant to send, or had no choice but to send, containing sensitive information such as bank information or Social Security numbers, which I would prefer not to send via email. And of course it’s not just my paranoia; security experts say never to do that. Having a secured provider that encrypts the mail at rest and also has a mechanism for sending secured mail to others could be useful. Really, what the secure mail feature does is send the recipient an email with a link back to the secured website that contains the actual message. I need to provide a password hint in the body of the mail I send. It’s not perfect; however, in most cases it will solve the problem of sending outbound secured mail.

One of the challenges with a system such as ProtonMail is that at present there is no mechanism to import or export mail. That means anything I receive is locked into that system. On day one that’s not a problem; however, I like to have data portability. ProtonMail says they are working on that function, but who knows when or if it will ever come to pass. I may still use them for some correspondence only and in essence have two private email addresses, one for secured and one for unsecured messages. That way I can route mail I want secured to the encrypted system.

I’ve also been looking at Tutanota as an alternative to ProtonMail. It appears to have the same import and export limitations but otherwise seems like a very similar and comparable option. Both systems offer a free tier. I signed up for both services to play around with them. I’ve since signed up for month-to-month service with both and am in the process of pointing an unused email domain to Tutanota, while I’ve already completed setting up ProtonMail. ProtonMail so far seems like a slightly better option in terms of usability; however, it costs significantly more per month than Tutanota. The only reason I signed up for the paid version of Tutanota after I had already signed up for ProtonMail was that it was less than two dollars a month. I hope to give both services a try for a month or two before settling on one or the other.

For now, the combination of moving my mail offline and having an encrypted provider as needed suits my needs. These changes are all still pretty new, so I will see how things pan out over the next month or two before I decide to make any tweaks or let the situation be as is for the time being.

That Time Where My Security Paranoia Might Pay Off in a Real-World Personal Scenario

In a recent post I wrote about how I had to wipe my Mac Mini at home due to a potential compromise in my Chrome browser. The ironic thing is that for months I had already been taking steps to minimize the chance of such an exploit. The problem likely began months earlier and didn’t present itself until recently, when the damage was already done. It just justifies the extreme measures I am taking to secure my web browsing.

At a high level, my approach is to isolate some, but not yet all, of my browser traffic in a Linux virtual machine. I know that theoretically a virtual machine is not 100% isolated, but I’m willing to chance using the virtual machine over booting into Tails from a USB key. That level of inconvenience is not something I typically want to be bothered with, and I feel that my current solution will be good enough.

Within the virtual machine I installed the Firefox and Chrome browsers as well as the Tor Browser. I also configured OpenVPN to use my VPN provider. I then set up a visual cue, i.e. a distinct desktop background for the virtual machine, to note that when I am using it I am in a semi-isolated system.

To protect the virtual machine from most exploits, I take a snapshot about every month that includes the latest patch level for all the applications and the operating system. I never use the pre-snapshot virtual machine to do anything other than update software or make base OS and application configuration changes I want to be persistent. Once a snapshot is taken, I use the virtual machine and then, when I’m done, revert back to that clean snapshot. I might not revert after every use; however, I try to do it as often as possible. At minimum, when I go to update the virtual machine, I revert back to the last known good “clean” snapshot and upgrade that. Then I take another snapshot.
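The revert-update-snapshot cycle above can be scripted. Here is a minimal sketch assuming VMware Fusion's `vmrun` command-line tool (Parallels has an equivalent CLI); the VM path and snapshot names are hypothetical, and the helpers build command lists so the cycle can be inspected before anything runs.

```python
import subprocess

VMRUN = "vmrun"  # assumption: VMware Fusion's CLI is on PATH

def snapshot_cmd(vmx: str, name: str) -> list:
    """Command to take a named snapshot of a VM."""
    return [VMRUN, "snapshot", vmx, name]

def revert_cmd(vmx: str, name: str) -> list:
    """Command to revert a VM to a named snapshot."""
    return [VMRUN, "revertToSnapshot", vmx, name]

def clean_cycle(vmx: str, old_snap: str, new_snap: str, run=subprocess.run):
    """Revert to the last known-good snapshot, then take a fresh one.

    In between you would boot the guest, apply OS and application
    updates by hand, and shut it down. `run` is injectable for testing.
    """
    run(revert_cmd(vmx, old_snap), check=True)
    # ... boot the VM, patch the guest, shut down ...
    run(snapshot_cmd(vmx, new_snap), check=True)
```

Dating the snapshot names (e.g. `clean-2017-05`) makes it obvious which baseline you are reverting to each month.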

Late last year I implemented this solution using an Ubuntu 14.04 virtual machine. In April I built new ones using Ubuntu 16.04. Because I own a copy of VMware Fusion for personal use and a work copy of Parallels, I have the operating system image in both virtual machine flavors. Other than a few minor tweaks, the new 16.04 image is mainly an operating system upgrade. I now have a “secured virtual machine” on all the main computers that I use day to day.

The solution isn’t perfect; however, as a first pass I feel it gives me the best trade-off between additional security and ease of use. The VPN gives me some anonymity. Tor and the VPN together give me more. The snapshot of the virtual machine decreases the chance that the system remains infected.

Longer term, I want to build a dedicated machine for Tails or Qubes. That solution would only work at home, since I need a dedicated computer setup for it. For now I will settle for the VM solution I have implemented, until I am comfortable using it and able to accept the extra effort involved in a dedicated machine configuration.
What’s interesting, or disturbing, to me is that some corporate executives and even government agencies view users of these tools with suspicion (the NSA reportedly labels Linux Journal readers and Tor and Tails users as extremists).

BitTorrent Sync Network Phase II

Since around the new year I have been trying to figure out what the next phase of my private cloud backup network will look like. The design originally leveraged several Raspberry Pis; however, in practice only one remains at a remote location. At the other remote locations I’m using old Mac Minis. Even the one Pi I do have deployed is inoperable and needs to be rebuilt. I’m not sure if it’s the Raspbian version of BitTorrent Sync or a more general problem with Sync, but I have had several problems with the Raspbian installs losing their license identity. When that happens, I have to re-add the BitTorrent Sync Pro license and reindex all my shares on that node. It’s annoying, and I’m concerned enough that I may abandon that option at some point.

At the same time, I’ve been looking at ways to better secure the remote data. All of my systems are at friends’ and family members’ houses, so endpoint security has been less of a concern than network security; however, when BitTorrent Sync announced encrypted folders I was extremely curious. After playing around with the feature for a while, I have opted not to use encrypted folders, but it’s something I’m still thinking about for the future.

On a side note, I’ve been contemplating a Linux desktop to complement my Mac. I shopped around and found a nice, inexpensive Zotac. I picked one up and put a 120 GB SSD and 8 GB of RAM in it. I wasn’t sure if I would use it as a desktop or to replace my BitTorrent Sync Pi. Right now I’m having it keep a replica copy of my data at home to test it out. I’m currently running Ubuntu 14.04 on it. It’s been running pretty well; however, I did have one or two anomalies with sync folders, so I am not yet ready to deploy it in the wild. My long-term goal is to replace all the Minis with something like the Zotac. The new install also increases that node to 4 TB of storage, versus the 1 TB hard drive on the Raspberry Pi I have deployed.

As part of my incremental upgrades I have put a 4 TB drive in one of the remote Minis, and the other has a 2 TB drive. That gives me some headroom, since a fully seeded backup is around a terabyte.

I’m holding off on any additional drive upgrades until I can confirm that the Ubuntu-based Zotac is working well. If it is, I hope to pick up another one or two.

The Raspberry Pis are not going to go to waste. I’m using one of them as an extra replica copy at home for BitTorrent Sync; I have a higher tolerance for failure there, since it’s an extra copy of my data at home. I’m installing some applications on another one. Future projects for the remaining ones include a possible reverse proxy, Wi-Fi hotspot, and/or webcam. I just need to find the time for all these projects. For now I just want to finish my backup solution upgrade.

And if you’re reading all this, thinking to yourself, do I really need a four-node private cloud network, the answer is of course not. The other answer is that it’s really six nodes, if you count the three I’m running at home. In the end I didn’t save money using a private cloud; even though a lot of the equipment was lying around, there were some upfront costs that I won’t recoup unless I use the system for 2 to 3 years. The reason to do it, however, was more because I can, and because I wanted control over my information. I’m glad I’m continuing to tinker with this, since I’m learning a lot and having a lot of fun.

Security Paranoia?

Today, when picking up T from school, one of the other dads pointed out a picture of his son and T in the school catalog. Is it a sad state of the world that the first thing I thought of and said was, “Did I sign a release for that?”

Pi Net is Live

At 19:28 local time today, my Pi Net node Epsilon came online at my friend’s house and started syncing with the rest of the network. This is the first remote node in my private cloud network. This node was built using a Raspberry Pi 2 and a 1 TB USB hard drive that I had lying around. For my data replication I am using BitTorrent Sync 2.x. In order to get the Pi working I had to learn a bit of Linux. This is the first major milestone in my project to ensure my personal data is backed up offsite from my apartment using a secured private cloud, without leveraging any potentially insecure public clouds.

Next up in building out this network will be a second off-site location. I need to finish setting up a second Pi and bring that node live for the network to be complete.

My Personal Private vs Public Cloud Debate

I have been pondering my 3-2-1 backup strategy for several months now. Even before I had a near-catastrophic issue with my Synology DiskStation back in April, I knew I wanted a more robust data management plan for my personal files. I had been using the Synology Cloud Station software, but in my original configuration I was limited, since all my data centered around my apartment. This is convenient but not the safest approach. I also only really had 2 copies of most of my data. A really rock-solid strategy has 3 copies across 2 sites and, if possible, different media types. I knew I wasn’t doing things well enough.

One solution I toyed around with was BitTorrent Sync. Back in March I tried it out and had big problems with the UI on my desktop, and the web interface on my DiskStation did not load regularly. I didn’t feel the solution was ready for me to use yet. I wasn’t confident in the Cloud Station software anymore either, since I ran into a data integrity issue with it in March. I had uploaded photos I took in March and noticed that they hadn’t gotten to my DiskStation, even though the desktop client said they were syncing. To make things more confusing, the files were syncing to my laptop but not the DiskStation.
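After a scare like that, it's worth verifying replicas independently rather than trusting the sync client's status display. A rough sketch: hash every file in two directory trees and report anything missing or differing.

```python
import hashlib
from pathlib import Path

def tree_digests(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    base = Path(root)
    digests = {}
    for path in base.rglob("*"):
        if path.is_file():
            digests[str(path.relative_to(base))] = hashlib.sha256(
                path.read_bytes()
            ).hexdigest()
    return digests

def compare_replicas(primary: str, replica: str):
    """Return (paths present on only one side, paths whose contents differ)."""
    a, b = tree_digests(primary), tree_digests(replica)
    missing = set(a) ^ set(b)  # symmetric difference: one-sided files
    differing = {p for p in set(a) & set(b) if a[p] != b[p]}
    return missing, differing
```

Reading every file is slow for large archives, but as an occasional spot check it catches exactly the "client says synced, files never arrived" failure described above.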

That problem led me to the BitTorrent Sync option and the decision to upgrade to a beta version of the Synology software. That in turn led to some problems and my near-fatal event with the DiskStation. So basically, in a mission to solve my backup strategy problem, I caused an event that a better backup strategy would have mitigated.

My near loss of all my data on the DiskStation was my wake-up call that I needed to figure out a working solution. After I (barely) recovered from that incident, I focused large amounts of time on solving my data strategy.

After my trust was shaken by the Cloud Station data integrity problems, I began to look at public cloud solutions. I am very concerned about security, so I discounted many public cloud providers. Dropbox, and pretty much any American-based solution, is just not trustworthy under today’s laws. Dropbox employees can even get into your files if they need to, regardless of the safeguards they claim to have, and I do not want anyone being able to get into my stuff unless I let them in. That security concern led me to MEGA. They aren’t US-based and they don’t have a way into your data. Their plans were more expensive than most cloud providers’; however, their 2 TB plan more than covered my needs.

One downside was that MEGA didn’t have any built-in backup solution for the DiskStation. That meant I could back up my data to the cloud, but the DiskStation wouldn’t be my main data source. I could still use it as a home server, but not as the home of my data. I wasn’t really pleased about that; however, I couldn’t find an alternative that worked. So earlier this spring I jumped into using MEGA as my cloud sync / backup solution. It had a lot of what I wanted, but it wasn’t perfect. It was the best of what was available that worked for me.

I spent a few weeks seeding and confirming my data was correct in MEGA before trying to make a sync copy onto my laptop. That was when I ran into issues. What I noticed was that after syncing files for a while, I would get to a point where the MEGA agent would freeze, sometimes after only 70-100 files downloaded. I would restart the agent, and it would do the same thing: copy a few more files and then stop. I couldn’t really figure it out. I tried reinstalling, putting the agent on another machine, and checking their online help, but I was unsuccessful at finding a solution. This was on top of a general annoyance where it would take 5-10 minutes for my admin page on the MEGA site to load. I don’t know what it was doing, but in any browser (Chrome, Firefox, or Safari) I would have the same issue. That wasn’t a showstopper; however, combined with not being able to download a replica of my data onto my laptop, it left me very concerned.

After a week of tinkering I gave up and had no choice but to revert my decision and go back to the Cloud Station software by Synology. I didn’t want to bother calling MEGA. I had a confidence issue: even if they could fix the problem in 10 minutes, with my level of technical knowledge and their online tools I should have been able to figure it out myself. If I can’t, I have concerns about their service. It wasn’t like my problem was complex; the system just stopped syncing data after a few files.

I wasn’t pleased; however, MEGA was never the perfect solution for me. I knew that going into it but thought I could make it work. In the end I couldn’t, so I went back to the DiskStation as my primary data store and used Cloud Station to sync. I kept an eye on the data I synced to make sure I didn’t have a repeat issue. My plan to build another DiskStation and leave it with a friend was back on the table. That was, until about a week ago.

I don’t know what got me to look at BitTorrent Sync again, but I installed it once more. I knew they were actively releasing newer versions, so I hoped that what happened last time was an issue that had been solved in the intervening months. I was pleasantly surprised to see, after some brief testing, that the UI display issues appeared to be resolved. Over a few days, I slowly turned off syncing via Cloud Station and enabled BitTorrent Sync, making backups before my changes just in case. The UI consistently worked on the desktop. For the web interface on the DiskStation, what I learned was, first, that Safari wasn’t so great for it, and second, that clearing my cookies for the site typically solved the issue. With that resolved, I moved most of my shares over to the BitTorrent Sync app within a week. I was originally going to try out the system for 30 days and then decide if I wanted to pay for the Pro version. After going over the 10-share limit for the trial, I opted to pay for 1 year of Pro for $39.

As of right now I have my desktop, DiskStation NAS, and laptop all replicating data with the new Sync software. I am at a steady state like where I was back in March. Getting here took a lot of research and trial and error; however, now comes the harder part. Now I need to finish this project and meet my objective of a true private cloud with data located in multiple locations.

Using BitTorrent Sync gives me a few options I didn’t have with the Synology-centric Cloud Station. My remote backup locations do not need to have a DiskStation. I have two old Mac Minis that I could provision with external hard drives and drop in remote locations. That was my original idea, at least. Now I am thinking bolder and cheaper. If I am successful, I will have a prototype to talk about soon.

Raspberry Pi Cloud Node Prototype

The first phase of my BitTorrent Sync project is mainly complete. I now have the Sync software running on my DiskStation, Mac Mini desktop, and MacBook Pro laptop. I have replaced my Synology Cloud Station app, and all files are kept up to date using BitTorrent Sync. That change gets me to the point where I was before with Cloud Station: I have access to all my files everywhere. However, I do not have a complete backup solution, since most of the data is only current in two places, and those places are 15 feet from each other, not really giving me much disaster recovery.

The next phase of my backup / cloud strategy is to have offsite copies of my data, or large parts of it, in case anything happens at home. I originally planned to do this with a 2nd Synology DiskStation somewhere. The costs were very high, and that solution limited me to one other location. When I was thinking of using the Cloud Station software from Synology, that approach made sense; however, now that I have BitTorrent Sync working, I have other options.
Raspberry Pi & Drive

Is BitTorrent Sync my answer to my sync situation?