Nov 01, 2016

Today I want to repost a great article that was first published on the sysadvent blog.

I think it’s a great post that shows how to integrate different pieces of software to achieve a modern continuous integration workflow.

Original article:
Written by: Paul Czarkowski (@pczarkowski)
Edited by: Dan Phrawzty (@phrawzty)

Docker and the ecosystem around it have done some great things for developers, but from an operational standpoint, it’s mostly just the same old issues with a fresh coat of paint. Real change happens when we change our perspective from Infrastructure (as a Service) to Platform (as a Service), and when the ultimate deployment artifact is a running application instead of a virtual machine.

Even Kubernetes still feels a lot like IaaS – just with containers instead of virtual machines. To be fair, there are already some platforms out there that shift the user experience towards the application (Cloud Foundry and Heroku come to mind), but many of them have a large operations burden, or are provided in a SaaS model only.

In the Docker ecosystem we are starting to see more of these types of platforms. The first was Dokku, which started as a single-machine Heroku replacement written in about 100 lines of Bash. Building on top of that work, other, richer systems like Deis and Flynn have emerged, as well as custom solutions built in-house, like Yelp’s PaaSta.



Oct 30, 2016

Guest post by Emma.

Home automation technology has struggled to gain serious traction because, for all its promised convenience, the current tangle of cords and communication standards still has consumers in knots. New “smart” systems have made huge strides recently, but not everyone has chosen to invest – yet. Totally “smart” homes still await the mainstream. To unlock this new market, technology companies are looking to a tool we’ve already taken to heart: the smartphone.



Jul 17, 2016

This is a small update (one year later) to a great article by Ilmari Kontulainen, first posted on blog.deveo.com.

I’ll post the original article in blockquotes and my notes in green.

Storing large binary files in Git repositories seems to be a bottleneck for many Git users. Because of the decentralized nature of Git, which means every developer has the full change history on his or her computer, changes in large binary files cause Git repositories to grow by the size of the file in question every time the file is changed and the change is committed. The growth directly affects the amount of data end users need to retrieve when they clone the repository. Storing a snapshot of a virtual machine image, changing its state and committing the new state to a Git repository would grow the repository by approximately the size of each snapshot. If this is a day-to-day operation in your team, it might be that you are already feeling the pain from overly swollen Git repositories.

Luckily there are multiple third-party implementations that try to solve the problem, many of them using a similar paradigm as a solution. In this blog post I will go through seven alternative approaches for handling large binary files in Git repositories, with their respective pros and cons. I will conclude the post with some personal thoughts on choosing an appropriate solution.
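
My note: before comparing the alternatives, it helps to know which blobs are actually bloating your history. Below is a minimal sketch of mine (not from the original article) that shells out to Git’s plumbing commands to list the largest blobs ever committed; the largest_blobs function name and the top-10 default are just illustrative choices.

```python
#!/usr/bin/env python3
# Minimal sketch: list the largest blobs in a Git repository's history,
# to see which binary files are responsible for a swollen clone.
# Assumes `git` is on the PATH; largest_blobs and the top-10 default
# are illustrative choices, not something from the original article.
import subprocess

def largest_blobs(repo=".", top=10):
    # Every object reachable from any ref, as "<sha> <path>" lines.
    objects = subprocess.run(
        ["git", "-C", repo, "rev-list", "--objects", "--all"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    shas = [line.split()[0] for line in objects if line.strip()]
    paths = {}
    for line in objects:
        parts = line.split(maxsplit=1)
        if len(parts) == 2:
            paths[parts[0]] = parts[1]

    # Batch query: type and size of every object in a single git call.
    info = subprocess.run(
        ["git", "-C", repo, "cat-file",
         "--batch-check=%(objecttype) %(objectsize) %(objectname)"],
        input="\n".join(shas) + "\n",
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    blobs = [(int(size), paths.get(sha, "?"))
             for otype, size, sha in (line.split() for line in info)
             if otype == "blob"]
    return sorted(blobs, reverse=True)[:top]

if __name__ == "__main__":
    for size, path in largest_blobs():
        print(f"{size / 2**20:8.2f} MiB  {path}")
```

Running it inside a repository that stores VM snapshots makes the cost of each committed image obvious, and it tells you which paths are worth migrating to one of the approaches discussed in the article.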



Jun 10, 2016

If you walked into the office this morning to find that your customer information had been compromised, or a disgruntled employee had wiped a database clean, would you be prepared? Have you set preventative measures in place to safeguard you against total loss? Do you have security features in place to help you retrieve lost data? Are you able to continue with business as usual or would a security breach such as this bring you to a standstill?

It’s a lot to think about, but according to USA Today, approximately 43% of businesses encountered a data breach at some level in 2014. With percentages like that, the likelihood of it happening to your business is high. So again, are you prepared? Below are a few signs that will help you determine whether your data loss prevention plan is intact or your company’s data is vulnerable:



May 03, 2016

Guest post by Oliver.

Linux Mint is one of the most powerful and elegant operating systems. It was launched to provide a modern, comfortable OS that is easy for every user to work with. The best thing about Linux Mint is that it is both free and open source. It also ranks third in popularity, after Microsoft Windows and Apple Mac OS. Thanks to its simplicity and clean design, it is well worth a user’s while, and many users prefer it because it works well on low-end machines and older hardware. The latest version of Linux Mint is 17.3, and it is easy to find and download online. The process of installing Linux Mint on your Windows PC is quite simple. Let’s see how we can install Linux Mint on our system.

Go ahead and get started.

[Screenshot: Linux Mint 17]

