Subscribe: Tiernan's Comms Closet
http://feeds.feedburner.com/lotas
Language: English
Tags: backup, data, dropbox folder, dropbox, files, folder, git mercurial, git, jekyll, mercurial, new, phone, server, site

Tiernan's Comms Closet



Don't Ask



Published: Mon, 08 Feb 2016 15:09:40 +0000

Last Build Date: Mon, 08 Feb 2016 15:09:40 +0000

 



Cloudflare and the Future

Fri, 14 Aug 2015 00:00:00 +0100

So, you may have noticed that the site has been up and down a bit for the last day or so... and now it's over SSL. Well, that's because this site is now hosted behind CloudFlare with SSL and SPDY enabled. But that's just one part of this post... Looking at the sites I run, I have decided to make some changes. The sites running are:

- tiernanotoole.ie, which is a technical blog and my main email account
- blog.lotas-smartman.net, which is this site, the oldest site I have. Email is still on here, but it's more a legacy site...
- geekphotographer.com, a photography-related blog with a technical side
- tiernanotoole.net, which will be a more network-related site (not live yet)
- tiernanotoolephotography.com, which will host my photography stuff, more business than technical...
- miniblog.tiernanotoole.net, a Tumblr site for quick content, more "personal" than tech and mostly photos...

There are a few more (well, 30 or so domains!) but it would take too long to explain, and I can't even remember what half of them are or where... So, this site, going forward, will be more "long form" posts, much like the 5+ Screens and a Cloud post from 2011. Future posts like the Using Dropbox as a personal Git and Mercurial Storage area post will probably go to tiernanotoole.ie, and shorter quick posts will live on the miniblog.tiernanotoole.ie. All the blogs have RSS feeds to subscribe to, so if you want to subscribe, go forth and do so... Thanks. –Tiernan [...]



Blog move details

Fri, 24 Apr 2015 00:00:00 +0100

Yesterday I wrote that I had moved my site to Jekyll, and I also mentioned I would post how I was doing this. Well, this is that post.

First, getting data out of WordPress was a bit of a pain (in my case... screwing around with NGinX and PHP-FPM, but that is out of scope for this blog post...). If you have a "normal" enough blog, check out Jekyll Exporter (geekphotographer and Tiernan's Podcast both worked first time...). If not, you're on your own... Anyway, I got a zip file out of Jekyll Exporter that included all my posts (some had to be modified...) and all my images. Styles, etc., not so much... But new blog, new style! (If you're on an RSS reader, check out the new site here.)

So, then I installed Jekyll on my machine... Again, out of scope for this post, but the Jekyll website has details on installing and running it. I installed Sabayon Linux on my GodBox 2 a while back... more on that eventually... So, once Jekyll was installed, I created a new site by typing jekyll new blogname. This created a load of files under the blogname folder. I then did some tweaking... I copied all my original posts, images and pages to that folder and then ran jekyll serve. This generates all the HTML (puts it into the _site folder in your blog directory) and then serves the HTML at http://127.0.0.1:4000. Hitting that page in your favourite browser then shows you how it will look. I did some tweaks to plugins and permalinks to get the site the way I wanted it, but once completed, I just ran jekyll build and all my final files were then in the _site folder. I then used RSync to transfer the contents of my _site folder to the server and hey presto! It's live!

Well, semi live... The main server is hosted in house, on an Ubuntu box running NGinX. But I have 3 internet connections, so I have an OVH box in France running both Varnish as a front end and HAProxy as a backend. HAProxy is pointing at the 3 WAN IPs I have, and if one falls over, it takes it out of the loop. Varnish then is pointing at that and doing caching... Finally, to make things even more complicated (but hopefully faster), I use Rackspace Cloud, which in turn uses Akamai as a CDN for the site...

So, there you have it. It is a lot more complicated than it needs to be. I could easily just generate the site and push to Amazon S3, like I am doing on tiernanotoole.ie, or even easier, use GitHub Pages. Hell, I use GitHub to host the site (privately) already, and if I use them to host it, they would build the bloody thing too! Anyway, it's working, I'm happy, and hopefully it means I will post more often... then again, not sure... One other thing I need to do, or at least think about doing, is to automate it... One of these days...

Either way, check out the tiernanotoole.ie site, since there is a lot more technical stuff there, or geekphotographer which is, obviously, for geeks and photographers! If you are looking at starting a blog using Jekyll, check out the following links: Jekyll main site, Jekyll Bootstrap, Jekyll Resources, Jekyll Template Guide, Jekyll Getting Started guide.

UPDATE 2015/08/13: This site is now hosted on a box in the house, but fronted by CloudFlare. [...]
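For reference, the whole build-and-deploy loop above boils down to a handful of commands. A minimal sketch, assuming a site folder called blogname; the server path (user@server:/var/www/blog/) is a made-up example, not my real setup:

    jekyll new blogname        # scaffold a new site into ./blogname
    cd blogname
    jekyll serve               # build into _site/ and serve at http://127.0.0.1:4000
    jekyll build               # final static build into _site/
    rsync -avz --delete _site/ user@server:/var/www/blog/   # push the built site to the server

And the failover idea above, as a rough HAProxy backend sketch (the three WAN IPs are documentation placeholders, not my real addresses): HAProxy health-checks each WAN address and drops any that stop answering, which is the "takes it out of the loop" behaviour:

    backend home_wans
        balance roundrobin
        option httpchk GET /
        server wan1 203.0.113.1:80 check
        server wan2 203.0.113.2:80 check
        server wan3 203.0.113.3:80 check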



Finally moved to Jekyll

Thu, 23 Apr 2015 00:00:00 +0100

Well, yesterday, after a very long absence, I said I was planning to move this site to Jekyll, and after many years of thinking about it, it has finally happened. If you're reading this, it's been done... I will post more about Jekyll, how I moved, etc., soon.

As for now, the site should still have all the same stuff the old one did, and should be a LOT faster, given there is no HTML generation done when you hit the page. Also, with the help of Rackspace and Akamai, this site should fly, since it's hosted behind their CDN product. Next is to move the photography blog and the podcast, though the podcast will probably be retired sooner rather than later...




Not dead, just lazy

Wed, 22 Apr 2015 00:00:00 +0100

(image) Well, it's been a while. Nearly three years. I've been busy and lazy for the last while. I have started blogging at Tiernanotoole.ie using Jekyll, and been trying to figure out what to do with this site. The Geek Photographer blog is still there, and between this and that, most of the updates are to the WordPress install itself.

I have been thinking of moving this site to Jekyll. It would just be HTML being served: no database, no security updates. But I would lose the ease of posting, like now, through my iPhone.

So, let's see what happens. I might get pissed off upgrading WordPress and managing the servers, and move sooner rather than later. What could possibly go wrong?

[Update] This is what happens when you don't post a lot. My caching has gone nuts and that image, which was put up as an afterthought, won't resize... Feck it anyway!

(image)



BITS transfer over Remote Desktop with PowerShell

Fri, 25 May 2012 00:00:00 +0100

So, I have a dedicated server, and the machine acts as a Hyper-V host. Only Remote Desktop is enabled on the box, for security, but the VMs have their own ports open (HTTP, SSH, etc.). Transferring files between the dedicated server and my home network is a bit of a pain... One way is to open a port locally (SSH or FTP) and download the file from home to the dedicated box... a bit of a pain, but it does work... But now I have found some magic in BITS Transfer! BITS Transfer allows you to download and upload files in the background, intelligently... hence the full name: Background Intelligent Transfer Service...

So, how do you get this to work with Remote Desktop? In the Remote Desktop Client, under the Local Resources tab, under Devices and Resources, click the "More" option and select the drives you want to share out to your server. Remember, the server has access to these files... I am not sure if everyone on the server can see them, so be careful. Connect to your server and go into Computer... You should, by default, see your extra drives... They don't have real drive names, though... So, what you need to do is go to the address \\tsclient (should always be the same...) and you will see the drives you have shared. Find the drive you're interested in, find the exact file you want to copy over, and Shift+Right-click on the file. You will see an option to Copy As Path. Click this.

Next, open PowerShell (should be installed on all servers; if not, Google for it with Bing and find out how to install it). Once PowerShell is opened, type Import-Module BitsTransfer. Now for the magic command:

    Start-BitsTransfer -Source $sourceFile -Destination $locationFile

That is it... Set $sourceFile to the path of the file you want to download, and $locationFile to the place you want it put. PowerShell will show a progress bar of how the transfer is doing...

But what happens if the connection drops before the download finishes? Simple. Open your connection again; you should see your original PowerShell window. In here, type:

    Get-BitsTransfer | Resume-BitsTransfer

Now, this is looking for ALL BITS transfers on the system, and then resuming them... You may want to check and see what Get-BitsTransfer shows as a result... In my case, I only want to start the jobs which are marked as having the error "TransientError", so my PowerShell command is:

    Get-BitsTransfer | Where-Object { $_.JobState -eq "TransientError" } | Resume-BitsTransfer

You could also use the Job ID if you have one (in my screen, it's being truncated...).

So, what about downloading FROM the server to your local machine? Well, in the Start-BitsTransfer, swap the source and destination... Source is a local file on your server and destination is the file share on \\tsclient you want to upload to... and the resume of uploads and downloads will work too.

Another interesting option is to add the -Asynchronous option to the Start-BitsTransfer command. This will run the command fully in the background. There is also the option of using -Suspended, which only adds the job, but does not start it immediately. In this case you can see some info about the download using the Get-BitsTransfer command. Finally, for background jobs, you need to run one final command: Complete-BitsTransfer. So, you would call:

    Get-BitsTransfer | Where-Object { $_.JobState -eq "Transferred" } | Complete-BitsTransfer

As the meerkat says, Simples!

[UPDATED] Want to monitor the progress of the download? bitsadmin /monitor will show you the status of your transfers! Handy! [...]
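Putting the pieces above together, a minimal end-to-end session might look like this. The cmdlets and parameters are the ones named in the post; the .iso and .zip paths are hypothetical examples:

    # inside the RDP session, load the BITS cmdlets
    Import-Module BitsTransfer

    # pull a file from the redirected client drive onto the server
    Start-BitsTransfer -Source '\\tsclient\C\isos\install.iso' -Destination 'D:\isos\install.iso'

    # if the session dropped mid-transfer, resume only the jobs that hit a transient error
    Get-BitsTransfer | Where-Object { $_.JobState -eq 'TransientError' } | Resume-BitsTransfer

    # background job: queue it, check on it, and complete it once transferred
    Start-BitsTransfer -Source 'D:\backups\site.zip' -Destination '\\tsclient\C\backups\site.zip' -Asynchronous
    Get-BitsTransfer
    Get-BitsTransfer | Where-Object { $_.JobState -eq 'Transferred' } | Complete-BitsTransfer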



Infinite Monkey Twitter

Wed, 04 Apr 2012 00:00:00 +0100

The Infinite Monkey Theorem states: a monkey hitting keys at random on a typewriter keyboard for an infinite amount of time will almost surely type a given text, such as the complete works of William Shakespeare. This made me think... and more importantly, think about Twitter... I am thinking about my final year college project, and have been playing with the Twitter Streaming API for the last few days. So far, I have about 650k tweets to play with, but it made me think about using "fake" data for tweets... Take the following as an example:

- The monkey tosses a coin to say if it's a response or a new tweet.
- If it's a response, a new coin is used to figure out if he responds to someone he follows or someone he does not follow... If he does not follow them, he tosses a third time to figure out if he wants to respond to a "famous" monkey (more than 10,000 followers) or not...
- If it's a response, find a tweet to respond to.
- The monkey randomly takes a number from 1 to 140 minus (1 for the "@", plus the length of the Twitter user name, plus 1 for the space). This is his character count.
- The monkey now tosses a coin to figure out if he wants to add a hashtag... If yes, he tosses again to either use a random trending hashtag or a completely different one...
- Finally, the monkey, using the number of characters left, randomly hits the keyboard and makes a nonsense tweet... This is much how I tweet at @tiernano anyway.

Using this as an idea, and adding in monkeys following each other depending on how they feel (more coin tossing), you could get a lot of tweets using some lower-powered machines (worker nodes) and some beefy hardware... This is starting to sound like an interesting problem... Now to figure out more... Leave it with me... [...]



Tiernan’s Podcast, Episode 1

Tue, 13 Mar 2012 00:00:00 +0000

So, my new podcast is now live! I am a podcaster again! For the next 3 weeks (today, Tuesday 20th and Tuesday 27th) a new episode will auto-publish. If you want to get them on your iPod, iPhone, i(insert name here), Windows Phone, etc., subscribe to the Podcast feed here. Or you can download the files directly from here.




5+ Screens and a cloud

Wed, 30 Nov 2011 00:00:00 +0000

Microsoft have talked about their idea of "Three screens and a cloud" for the future of computing. The idea is you have your PC (laptop or desktop, or tablet once Windows 8 ships), your smartphone, and your TV with an Xbox connected, and all your data is shared between them with the cloud... But it has made me think... Why only 3 screens? Why not 5?

Why 5+

The 5+ screens I am thinking about are as follows:

- The usual 3 suspects: main PC, smartphone and TV...
- Your watch
- A tablet (secondary machine)
- Possibly your car in-dash screen (could be your smartphone, or use your smartphone for processing and internet connection)

The idea of using your watch as a screen is not exactly new. The original Microsoft SPOT watch used an FM signal to get data from the MSN Direct service. This included weather and, if I remember correctly, also calendar items. In recent times, companies like Sony Ericsson have developed their LiveView: a screen which connects to your Android phone using Bluetooth and shows information like phone status, weather, Facebook friends, and more.

iPod Nano

If you look at the new iPod Nano, we are getting close... There are even watch straps for the Nano. Apple's latest Nano came with extra watch faces since people wanted them! But the Nano is missing a couple of things:

- Bluetooth, really needed! Not only for listening to music, but also for talking to different devices... Imagine if your Nano talked to your iPhone and got data streams like news, weather, phone call, email and SMS information, or your next calendar appointment?
- Apps! It seems as if the Nano is running a micro copy of iOS, so in theory, it should be able to run iOS apps... slightly modified apps, mind you. And I can't see it having enough power to run Angry Birds, but basic info apps should be enough...
- A vibrate function: the ability to "buzz" when something happens... I carry my phone in an inside pocket or a pocket with other things, so I don't usually feel it buzzing or hear it ringing... but if my "watch" did, I would notice!
- A proper docking station: I would like, before I go to bed, to stick my "watch" in a docking station, which both charges and syncs contents to the device. It would also sync with the dock, telling it what time I want to be woken up the next morning.

Your Car

My car is currently a 2006 BMW 520d. It has the professional navigation system, which includes an 8.5" widescreen display in the dashboard. This is used by most of the functions in the car: audio, navigation, communications, in-car information and climate control, all controlled by the iDrive. If your car could connect to your phone (mine already does for Bluetooth phone calls) to get information, data and more, it could be much more interesting:

- Any music on your phone available at your fingertips.
- Being able to call up data from the internet, giving you tips on traffic, your latest calendar appointments, friends' check-ins, and more.
- Your car being able to send data to your mechanic over the internet for diagnostics. It could also send data like mileage, fuel usage, etc., so you can see how your fuel economy is doing...
- Apps, working both on your phone and your car, for operations like navigation and music. BMW already have this with their ConnectedDrive, which does a lot of these functions, but currently only works with the iPhone.

Your secondary machine

Most people have one main PC. Be it a big honking workstation, like the GodBox, or a standard laptop or desktop machine, it usually is something fairly powerful, which could be used for anything from internet browsing and email, to syncing your music and videos to your devices (iDevices, Windows Phones, etc.), to video editing and photo processing, to developing code. Either way, it's a large enough machine with enough power t[...]



New Backup plan… out with Jungle Disk and ZManda Cloud Backup, in with CrashPlan, MySQLBF and SqlBF…

Thu, 10 Nov 2011 00:00:00 +0000

[UPDATE: After a few days, I tried CrashPlan Pro on a 30-day trial... Now I find out that it's not available outside the US and Canada... It's a bit hidden in their FAQ... So, back in search I go!]

My current backup strategy is a long and convoluted one, but it did work... until I shut a lot of stuff down... Now I am starting to re-think my plan... First, the existing "plan":

My main laptop, a MacBook Pro, is backed up to my main workstation (AKA the GodBox) via iSCSI. The GodBox is running Windows 2008 R2 Server, and has the free Microsoft iSCSI Target installed. I am sharing 2 300Gb VHDs to the MacBook Pro, and those are in RAID 1 on the laptop. Time Machine then backs up to this drive. Also, the iTunes music, movies, apps and TV shows, and also photos from my camera, are backed up to Jungle Disk. Currently this weighs in at about 160Gb...

The GodBox itself has a couple of RAIDed drives (I know, I know... RAID is not backup...). So, my photos are stored on a RAID 1 array, which is then backed up using Jungle Disk. The photo collection is weighing in at about 180Gb or so... probably more.

I have a dedicated server in Germany (this is where this site is being served from... hello from Germany!) and that has the actual contents of the site: database, images, etc. These are currently backed up regularly (every 3 hours for an incremental backup, nightly for a full backup). I am using ZManda Cloud Backup for this. ZManda has the advantage of backing up SQL Server and MySQL, as well as SharePoint and Exchange.

There is a machine in the house which acts as an internal Git server (Linux, SSH, Git... simple). JungleDisk was installed on this and backed up the folder where Git lived. Works grand...

So, there are a couple of outstanding issues with this setup:

- For the amount of data I am storing (about 500Gb, give or take) with JungleDisk, it's costing me about $80 a month. It's costing me about the same as the dedicated server, and it is about the same cost as a 1Tb drive, every month!
- I have logged into the dedicated server a few times over the last while and found that ZManda was using 900Mb of RAM. Restarting the service solves the issue, but it's still a bit of a pain to restart every few days.
- ZManda released Version 4 of their ZCB software. I am still on 3.x since MySQL backup was not added to Version 4 yet...
- If I want to use the same software to back up files around the house, but not in the cloud, ZManda can do it, but JungleDisk cannot. I am using 2 different backup systems...
- JungleDisk charges $2 per client machine per month, and $4 per server per month (some storage is included in that price... check their site for the latest). Storage is then charged at about $0.15 per GB per month if you use their API key, or if you use your own, Amazon will charge you... They work with Amazon S3 or RackSpace Cloud for storage.
- ZManda is a flat fee per month ($4.95) and you can use that one license to back up as many machines as you want. Storage is charged at $0.15 per GB, and they are offering 25Gb free... The 25Gb is based on each AWS Region giving you 5Gb free... I suppose since a new region just opened in Oregon, that could now be 30Gb...

So, I went looking, and I think I found a solution... Some investigation is still needed, but we are getting there... Here is the plan. The MacBook Pro will still be using Time Machine to back up to the GodBox using iSCSI, but the plan is the GodBox gives it 500Gb of storage, already RAIDed... Music, movies, photos and apps will be backed up to CrashPlan. The plan I am looking at will cost me about $12 per month, gives me the ability to back up up to 10 machines, and gives me unlimited space... We all know what "unlimited" means, but it should be enough... Also, CrashPlan gives me the option to back up to my own machines. So, these files will also [...]
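To put rough numbers on the comparison above (using only the figures quoted in this post; treat them as back-of-the-envelope estimates, not a current price list):

    JungleDisk: ~500Gb x $0.15/Gb = ~$75/month storage, plus $2-4 per machine = the ~$80/month figure above
    ZManda ZCB: $4.95 flat + ~500Gb x $0.15/Gb = ~$80/month (minus the ~25Gb free tier)
    CrashPlan:  ~$12/month flat, up to 10 machines, "unlimited" storage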



Using Dropbox as a personal Git and Mercurial Storage area

Tue, 01 Nov 2011 00:00:00 +0000

This is something I have been using for a while now, and I thought it might be an idea to write a quick post explaining how to set Dropbox up as a folder to push your Git and Mercurial files to... But first, why would you want to? So, you have a project you want to work on with multiple machines, and you want a source control system to easily manage files and state. You decided to use Git or Mercurial. I am not going to tell you which one to use... I use both for different projects. Anyway, you have decided to use Git or Mercurial, and you now want to be able to access the files from multiple locations. Since these locations are all your machines (hopefully), you need to have a location that all machines can access. So, with the help of this tutorial, you can use Dropbox as your shared location.

The following steps need to be done only once. When adding extra machines, follow the steps further down.

First, you will need a Dropbox account. If you don't have one, click here and sign up. Download the Dropbox installer, and link your machine. When installing, you will be asked to set the default location of your Dropbox folder, or a custom location. I picked the default location, and as I am running Windows 7, my Dropbox location is at c:\users\tiernano\dropbox. Yours will be different... Next, you will need either Git or Mercurial for Windows. msysgit is what I use for Git on Windows. Mercurial has a Windows installer also.

So now, we will branch out into either Git or Mercurial tasks.

For Git:

- Open Git Bash from the Start menu. You are now in a Unix-ish style command prompt... If you are not familiar with Unix... this might be interesting...
- First, your Dropbox location will be at ~/Dropbox (~/ being your home directory). cd into your ~/Dropbox folder, and create a folder called git (this is what I did; you may want to be different).
- cd into your git folder, and create a new directory called yourproject.git (again, change it to your liking...). cd into the yourproject.git folder and type git init --bare. This creates a bare repo for you to use. Since this is in your Dropbox folder, all your machines synced with Dropbox will get this folder soon...
- Next, create a working folder. Decide where you want to put this (ideally, not in your Dropbox folder). In my case I have a folder in my Documents folder in my home directory called project. I type cd ~/Documents/project to get to this folder. When in this folder, type git init. This folder is now a git working folder.
- Add your files as needed. git add <file> will add a file to git. git commit -m "your message" -a will commit all modified files in your repo with the message "your message"... Have a look for a git tutorial to find out more.
- Next, add a remote to your current repo. git remote add dropbox ~/Dropbox/git/yourproject.git will add your Dropbox as a remote repo. Typing git push dropbox master will push your master branch to Dropbox. That's most of what you need to know.

For Mercurial, the following needs to be done:

- Open a command prompt, and cd into your Dropbox folder. Create a folder; I called mine hg (the Mercurial command is hg...).
- In your hg folder, create a folder called myproject.hg. cd into the folder and type hg init. This creates a Mercurial directory.
- Next, go to the directory you want to use for your project. Type hg clone <path to your Dropbox repo>. If hg is set up correctly, all should be grand. hg add and hg commit files, and then hg push to the Dropbox location.
If you want to add a new machine to your Git repo, find the directory you want to clone to and type git clone ~/Dropbox/git/yourproject.git yourproject. Your working directory is now in the yourproject folder. git push will push back to Dropbox. git pull will [...]
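Pulling the steps above into one place, here is the whole flow as a single command listing (the paths are the ones used in this post; swap in your own project names):

    # one-time Git setup: bare repo inside Dropbox
    cd ~/Dropbox && mkdir git && cd git
    mkdir yourproject.git && cd yourproject.git
    git init --bare                      # bare repo, synced everywhere by Dropbox

    # working copy, outside Dropbox
    cd ~/Documents/project
    git init
    git add .
    git commit -m "first commit"
    git remote add dropbox ~/Dropbox/git/yourproject.git
    git push dropbox master

    # Mercurial equivalent
    mkdir ~/Dropbox/hg
    hg init ~/Dropbox/hg/myproject.hg
    hg clone ~/Dropbox/hg/myproject.hg ~/Documents/myproject

    # on any other machine linked to the same Dropbox account
    git clone ~/Dropbox/git/yourproject.git yourproject

One design note: keeping the working copy outside the Dropbox folder, as the post says, matters. Only the bare repo should sync, since Dropbox syncing a working tree mid-commit can corrupt repo state.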