<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>qone.io</title><link>http://qone.io/</link><description>Recent content on qone.io</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Tue, 20 Dec 2016 00:00:00 +0000</lastBuildDate><atom:link href="http://qone.io/index.xml" rel="self" type="application/rss+xml"/><item><title>different sides of devops in recruitment and engineering</title><link>http://qone.io/posts/devops-in-reality-vs-recrutment/</link><pubDate>Tue, 20 Dec 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/devops-in-reality-vs-recrutment/</guid><description>&lt;p&gt;The question is: what is devops, and how do you recruit a devops engineer? The short answer is you don&amp;rsquo;t; a devops engineer does not exist. Recruiting a so-called devops engineer is all a bit undefined and often misunderstood, and the term is often used by head hunters and recruiters to find software engineers and ops engineers.&lt;/p&gt;
&lt;p&gt;So, in short, what is devops? Devops is a way of working in an engineering organisation. This way of working generally means a few things. You start by moving the organisation from an &lt;code&gt;itil&lt;/code&gt; or &lt;code&gt;waterfall&lt;/code&gt; process to a more &lt;code&gt;agile&lt;/code&gt; way of working. What this means is that you strive for a culture that gives people freedom under responsibility and short development cycles, allowing quick adaptation and change, to move fast. The one big part here is &lt;code&gt;trust&lt;/code&gt; in the engineers.&lt;/p&gt;</description></item><item><title>simple prometheus exporter in python</title><link>http://qone.io/posts/docker-prometheus-exporter-python/</link><pubDate>Sun, 28 Aug 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/docker-prometheus-exporter-python/</guid><description>&lt;p&gt;Writing a simple prometheus exporter to collect metrics from an external system that needs monitoring. I will use the official &lt;code&gt;prometheus_client&lt;/code&gt; package for python and &lt;code&gt;falcon&lt;/code&gt; to serve the exporter.&lt;/p&gt;
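A minimal sketch of the idea using prometheus_client; the metric name, service label value, and queue depth here are made-up examples, and a real exporter would serve the output through falcon rather than printing it:

```python
# Sketch of a labelled metric for an external system, rendered in the
# Prometheus text format. All names and values are hypothetical.
from prometheus_client import Gauge, generate_latest

queue_depth = Gauge(
    'external_queue_depth', 'Queue depth of the monitored external system',
    ['service'])

def collect(service_name, depth):
    # Normally service_name would come from your service discovery.
    queue_depth.labels(service=service_name).set(depth)

collect('billing', 42)
# generate_latest renders the metric in the Prometheus exposition format.
print(generate_latest(queue_depth).decode())
```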
&lt;p&gt;Note that the following example is written with the assumption that you are collecting metrics from other systems, where multiple services are written in the same framework and share a generic set of metrics. That is why there is a service label included. Normally you should get the service name from your service discovery.&lt;/p&gt;</description></item><item><title>lessons learned by running docker on mesos and aurora in production</title><link>http://qone.io/posts/lessons-learned-by-running-docker-in-production/</link><pubDate>Sun, 21 Aug 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/lessons-learned-by-running-docker-in-production/</guid><description>&lt;p&gt;A short note on how we run docker. We run docker using aurora to schedule on mesos, currently both on aws and on premise. Every service has an aurora job defined. When a job is scheduled it allocates resources from mesos and starts; the job is a set of one or more docker containers. This all works ok.&lt;/p&gt;
&lt;p&gt;The difficulty with running docker is not the running part but the life cycle of a service: stopped containers, old image versions, unused images, and volumes will start to eat up disk space. This is something docker has not solved nicely yet. There are a lot of efforts from the community to come up with solutions for purging the junk that builds up when running it in production.&lt;/p&gt;</description></item><item><title>killing the traditional quality assurance way of testing</title><link>http://qone.io/posts/breaking-the-traditional-testing-trend/</link><pubDate>Sun, 07 Aug 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/breaking-the-traditional-testing-trend/</guid><description>&lt;p&gt;After years in testing roles it&amp;rsquo;s obvious that the traditional way of doing testing no longer holds up. What I mean by the traditional way of testing is quality assurance: you have separate development and testing teams, i.e. QA teams. When the development team is done they deliver to QA, who do their work following a long and usually static test plan and process; you are not supposed to step outside the boundaries of the plan or the process. This way of working is not only slow, it disconnects the people working on testing from development.&lt;/p&gt;</description></item><item><title>Docker alpine smaller image footprint</title><link>http://qone.io/posts/docker-apline-image-size-improvments/</link><pubDate>Wed, 20 Jul 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/docker-apline-image-size-improvments/</guid><description>&lt;p&gt;Working with docker images to minimise the footprint, i.e. the size of an image. There are a few things you can do to get smaller images. I will show some examples of a small go and python3 service built in Debian and alpine linux based images to compare the resulting footprints.&lt;/p&gt;
&lt;p&gt;The key to building small docker images is to use only &lt;code&gt;one&lt;/code&gt; &lt;code&gt;RUN&lt;/code&gt; step in the Dockerfile. Why, you might ask? Every RUN in docker creates a layer, and the layer contains whatever you do in it, such as adding a package cache. If you then remove the cache in a later RUN, it is still there in the parent layers. So what you do is chain commands with &lt;code&gt;&amp;amp;&amp;amp;&lt;/code&gt; in the same RUN and, at the end, remove the files and cache you don&amp;rsquo;t need. Selecting the base image affects the footprint the most; I will look at Debian and alpine based images.&lt;/p&gt;</description></item><item><title>Fabric dynamic hosts tasks</title><link>http://qone.io/posts/python-fabric-dynamic-hosts/</link><pubDate>Thu, 30 Jun 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-fabric-dynamic-hosts/</guid><description>&lt;p&gt;Using fabric with dynamic hosts. Most examples in the fabric documentation assume that you have a static set of hosts. That is no longer the case when working with the cloud and a dynamic number of hosts. The way fabric wants you to run tasks is to set the hosts you have on &lt;code&gt;fabric.api.env&lt;/code&gt;. If you want to do this on the fly you have one option: a function called &lt;code&gt;execute&lt;/code&gt;, available in &lt;code&gt;fabric.api&lt;/code&gt;, that takes a &lt;code&gt;hosts&lt;/code&gt; keyword telling fabric to run on those hosts as if you had set &lt;code&gt;env.hosts&lt;/code&gt;.&lt;/p&gt;</description></item><item><title>why do we need unit tests</title><link>http://qone.io/posts/testing-with-unittess/</link><pubDate>Sun, 05 Jun 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/testing-with-unittess/</guid><description>&lt;p&gt;You might be thinking that unit tests are kind of an obvious thing to have when you are
writing code. If you think so, you are wrong: not all developers want to, or can, write unit tests.
I find this quite interesting.&lt;/p&gt;
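To make the point concrete, here is a minimal sketch of a unit test; the function under test, its name, and the numbers are made up for illustration, and the key property is that both can be imported and run without any application running:

```python
# Hypothetical function under test plus its unit test.
import unittest

def add_vat(price, rate=0.25):
    """Return the price including VAT, rounded to two decimals."""
    return round(price * (1 + rate), 2)

class AddVatTest(unittest.TestCase):
    def test_default_rate(self):
        self.assertEqual(add_vat(100), 125.0)

    def test_custom_rate(self):
        self.assertEqual(add_vat(100, rate=0.1), 110.0)
```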
&lt;p&gt;A short note about me: I am a software developer in test. I have spent years on design,
test automation, and writing tooling to help other developers be more efficient at
finding issues in the code they deliver, and on strategies around quality and testing, mostly
in the backend and infrastructure area. I am also part of the design phase of a service,
helping to point out risky areas.&lt;/p&gt;</description></item><item><title>python click bash zsh auto complete same files</title><link>http://qone.io/posts/python-click-auto-complete-bash-zsh/</link><pubDate>Sun, 24 Apr 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-click-auto-complete-bash-zsh/</guid><description>&lt;p&gt;Sharing the same auto completion for zsh and bash. In zsh there is something called &lt;code&gt;bashcompinit&lt;/code&gt; that can be used to share the same completion between bash and zsh. Here is a small example project that sets up completion for you when you are not using a nested command structure. See the next post about nested completion for bash/zsh.&lt;/p&gt;
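For context, a minimal click cli of the kind such a completion script wraps; the command and argument names are made up for illustration, not taken from the real project:

```python
# A tiny click-based cli; bash/zsh completion would be hooked up to the
# top-level group. Names are hypothetical.
import click

@click.group()
def cli():
    """Example command line tool."""

@cli.command()
@click.argument('name')
def greet(name):
    """Greet someone by name."""
    click.echo('hello ' + name)
```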
&lt;p&gt;project structure&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── MANIFEST.in
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── Makefile
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── auto_compleate_install.sh
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── ecli
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── ecli_lib
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;│ ├── __init__.py
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;│ └── main.py
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── requirements.txt
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;└── setup.py
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;click example cli command&lt;/p&gt;</description></item><item><title>python click bash zsh auto complete same files nested command groups</title><link>http://qone.io/posts/python-click-auto-complete-bash-zsh-nested/</link><pubDate>Sun, 24 Apr 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-click-auto-complete-bash-zsh-nested/</guid><description>&lt;p&gt;Sharing the same auto completion for zsh and bash. In zsh there is something called &lt;code&gt;bashcompinit&lt;/code&gt; that can be used to share the same completion between bash and zsh. Here is a small example project that sets up completion with command groups and nested commands.&lt;/p&gt;
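For context, a minimal click cli with one nested command group, the shape this completion script has to handle; the group and command names are made up for illustration:

```python
# A click cli with a nested group; completion must descend into the
# group. Names are hypothetical, not from the real project.
import click

@click.group()
def cli():
    """Top level command group."""

@cli.group()
def env():
    """Commands for working with environments."""

@env.command(name='list')
def list_envs():
    """List known environments."""
    click.echo('dev prod')
```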
&lt;p&gt;project structure&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── MANIFEST.in
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── Makefile
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── auto_compleate_install.sh
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── ecli-complete-nested.sh
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── ecli-nested
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── ecli_nested_lib
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;│ ├── __init__.py
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;│ └── main.py
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;├── requirements.txt
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;└── setup.py
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;click example cli command&lt;/p&gt;</description></item><item><title>python protobuf over http</title><link>http://qone.io/posts/python-protobuf-over-http/</link><pubDate>Mon, 11 Jan 2016 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-protobuf-over-http/</guid><description>&lt;p&gt;The point of this is a small example of how you can use protobuf to send data over http. In this example I will be using Python with the package falcon for the server, and a command line tool as the client. The example is a simple ping/pong containing a message, a channel, and PING or PONG sent to the server. The server responds with the same message and channel and a PONG.&lt;/p&gt;</description></item><item><title>jenkins test result dashboard</title><link>http://qone.io/posts/jenkins-test-dashboard/</link><pubDate>Sat, 14 Nov 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/jenkins-test-dashboard/</guid><description>&lt;p&gt;Minimal Jenkins test result dashboard&lt;/p&gt;
&lt;p&gt;&lt;img src="http://qone.io/imgs/dashi-demo.png" alt="dashboard img"&gt;&lt;/p&gt;
&lt;p&gt;The background to the dashboard: as part of onboarding at work we get to do a small project. Like many others we use Jenkins as the tool for our continuous integration tests. Since most of the Jenkins plugins that exist for build information are not that pretty, it made a lot of sense to do a minimalistic dashboard for the test results. And why not do it with React as the frontend rendering engine and python as the backend? The dashboard also uses redis as a cache for the result data, and haproxy in the docker stack to terminate http on port 80. The frontend polls the backend every 15 seconds for new data.&lt;/p&gt;</description></item><item><title>backup of linux vps using gsutil</title><link>http://qone.io/posts/backup-linux-gsutil/</link><pubDate>Sun, 30 Aug 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/backup-linux-gsutil/</guid><description>&lt;p&gt;Backing up linux servers to google cloud storage with gsutil using the storage class nearline. What you need for this is a google cloud account. Start by creating a new project for the backup, then use that Project ID to configure gsutil in the next step.&lt;/p&gt;
&lt;p&gt;Start by configuring gsutil.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;$ gsutil config
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Create a nearline bucket on google cloud storage. The mb command means make bucket; the -l flag sets the location, with the options ASIA, EU, and US. For more options take a look at the &lt;a href="https://cloud.google.com/storage/docs/gsutil/commands/mb"&gt;gsutil mb doc&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>selenium as a service in osx</title><link>http://qone.io/posts/selenium-service-osx/</link><pubDate>Sat, 15 Aug 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/selenium-service-osx/</guid><description>&lt;p&gt;Setting up the selenium server as a service in os x that starts on boot. What is needed is a plist service file that starts it and logs stdout and stderr. I will be using homebrew to install the selenium server. In this example I have used a vagrant image of Yosemite; you can find it here: &lt;a href="http://files.dryga.com/boxes/osx-yosemite-0.2.1.box"&gt;vagrant image&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;A note about OS X licensing:
Apple&amp;rsquo;s EULA states that you can install your copy on your actual Apple hardware, plus up to two VMs running on that hardware. Using this box on other hardware may be illegal, so do so at your own risk.&lt;/p&gt;</description></item><item><title>docker machine and aws in combination with ansible</title><link>http://qone.io/posts/docker-ansible-aws-docker-machine/</link><pubDate>Sat, 13 Jun 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/docker-ansible-aws-docker-machine/</guid><description>&lt;p&gt;Initial setup with ansible and &lt;code&gt;docker-machine&lt;/code&gt;. In this case I have been using aws to run a docker host created with &lt;code&gt;docker-machine&lt;/code&gt;. The goal was to provision an ec2 box and have a few docker containers running in that ec2 vm. Note that the same approach should work with any of the &lt;code&gt;docker-machine&lt;/code&gt; drivers; there is nothing unique to aws in the way I have done this. As far as I know you cannot use ansible on boot2docker, so I have assumed that there is a full linux vm running.&lt;/p&gt;</description></item><item><title>sqlite3 in memory db for testing db functions in python</title><link>http://qone.io/posts/python-sqlite-testing-clean-state/</link><pubDate>Sat, 06 Jun 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-sqlite-testing-clean-state/</guid><description>&lt;p&gt;Designing an app to be as testable as possible. When writing an application that you want to write unit tests for, the best option is for everything in the application to not depend on the application running; then we can import a function and mock or test the isolated function on its own. One example of this is a database function and a database. In this example I will use sqlite3, which has an option to run the db in memory.
This is very useful, since you want a clean state of the database for every test.&lt;/p&gt;</description></item><item><title>encrypted backup with arq amazon and s3 glacier</title><link>http://qone.io/posts/backup-s3-arq/</link><pubDate>Sat, 16 May 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/backup-s3-arq/</guid><description>&lt;p&gt;As most with any knowledge of IT and tech know, backup is a big topic, and so is encryption these days, when there has been a lot of talk about our data being readable if we do not use encryption everywhere. Now that most of us have our lives in digital form, the information we store on our phones and computers is more than ever a target for criminals, and the data is important to us. On top of that, electronics break, and when that day comes no one will be happy if all their personal data, files, and photos are lost to a broken phone or computer. This is why backup is so important. Doing backup right is not that hard. Running around with an external drive or usb stick is not a good option as a main backup; you will forget to do it. The best is to have one or more incremental backups running in the background, so you don&amp;rsquo;t have to think about it all the time. You should of course do restores from the backups every now and then to be sure that they work as expected.&lt;/p&gt;</description></item><item><title>my travel gear</title><link>http://qone.io/posts/minaal-travel-gear/</link><pubDate>Sun, 03 May 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/minaal-travel-gear/</guid><description>&lt;p&gt;Travelling with a &lt;code&gt;Minaal&lt;/code&gt;. I use packing cubes to organize my clothes. When packing clean clothes I roll them and pack them in the cubes, which somewhat prevents wrinkled clothes. The &lt;code&gt;Minaal&lt;/code&gt; itself is great to carry some weight in.
A backpack I have used before is the &lt;a href="http://missionworkshop.com/products/bags/backpacks/roll_top/large_vandal.php"&gt;Mission Workshop Vandal&lt;/a&gt;. That one is great, but it&amp;rsquo;s heavy. Load in the &lt;code&gt;Minaal&lt;/code&gt; works great with a heavy macbook pro retina 15; since it sits close to the back it somehow doesn&amp;rsquo;t feel that heavy. Having lots of small pockets to pack in is also great, even some hidden ones like the one in the bottom of the main compartment. When going through airport security the laptop pocket makes it easy to take the laptop out and put it back again.&lt;/p&gt;</description></item><item><title>docker compose with selenium</title><link>http://qone.io/posts/docker-compose-selenium-hub/</link><pubDate>Fri, 24 Apr 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/docker-compose-selenium-hub/</guid><description>&lt;p&gt;Running a local selenium hub with &lt;code&gt;Firefox&lt;/code&gt; and &lt;code&gt;Google Chrome&lt;/code&gt; nodes using &lt;a href="https://github.com/docker/compose"&gt;docker-compose&lt;/a&gt;. Using &lt;code&gt;docker-compose&lt;/code&gt; makes the setup even more convenient than using the &lt;code&gt;docker --link&lt;/code&gt; commands. Starting and stopping the setup is easy, and scaling the number of Chrome and Firefox nodes up and down is just a command too.&lt;/p&gt;
&lt;p&gt;&lt;code&gt;docker-compose&lt;/code&gt; builds a setup based on a compose file, which is a yaml file. The setup file for a selenium hub with two nodes can look like this.&lt;/p&gt;</description></item><item><title>a theory of productivity</title><link>http://qone.io/posts/theory-of-productivity/</link><pubDate>Sat, 18 Apr 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/theory-of-productivity/</guid><description>&lt;p&gt;How to think about productivity. Let&amp;rsquo;s take an example: a math test. The test contains 20 problems, and the total assumed time to finish it with all correct answers, with no interruptions, is 60 minutes. All good. Now let&amp;rsquo;s mix interruptions into the equation: at 5 random times you will get interrupted to answer a random question, which takes 1-2 minutes, and then return to the test. If we assume the interruptions land when you are close to an answer, 3 of the 5 problems where you got interrupted will now be incorrect. For every interruption you also had to restart the question you were on, since you forgot the answer you almost had.&lt;/p&gt;</description></item><item><title>html email with aws ses and boto</title><link>http://qone.io/posts/aws-ses-python-boto/</link><pubDate>Sat, 11 Apr 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/aws-ses-python-boto/</guid><description>&lt;p&gt;Sending html email with amazon ses &lt;code&gt;simple email service&lt;/code&gt; using the python module &lt;a href="https://github.com/boto/boto"&gt;boto&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;To start using the &lt;code&gt;ses&lt;/code&gt; service you need to verify two email addresses: the one you want to send from and the one you want to receive at. To verify them you need valid addresses, since a validation email will be sent to each. After that is done you can start to use the addresses. At this stage you only have a sandbox version of &lt;code&gt;ses&lt;/code&gt; and can only send/receive with the verified addresses; at a later stage you can request production access, which opens up sending email to anyone. In the sandbox state of &lt;code&gt;ses&lt;/code&gt; you are limited to 200 emails every 24h.&lt;/p&gt;</description></item><item><title>building a debian docker base build for a flask app</title><link>http://qone.io/posts/docker_basic_build/</link><pubDate>Fri, 03 Apr 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/docker_basic_build/</guid><description>&lt;p&gt;Start by pulling down a debian base image: &lt;code&gt;docker pull debian:8.0&lt;/code&gt;. Then create a &lt;code&gt;Dockerfile&lt;/code&gt; for a base build for a python flask app, based on debian 8.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" style="color:#f8f8f2;background-color:#282a36;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;"&gt;&lt;code class="language-dockerfile" data-lang="dockerfile"&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;&lt;span style="color:#ff79c6"&gt;FROM&lt;/span&gt; &lt;span style="color:#f1fa8c"&gt;debian:8.0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;&lt;span style="color:#ff79c6"&gt;RUN&lt;/span&gt; apt-get -y update &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;	&lt;span style="color:#ff79c6"&gt;&amp;amp;&amp;amp;&lt;/span&gt; apt-get upgrade -y &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;	&lt;span style="color:#ff79c6"&gt;&amp;amp;&amp;amp;&lt;/span&gt; apt-get install -y &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;		python-setuptools &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;		python-pip
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;&lt;span style="color:#ff79c6"&gt;RUN&lt;/span&gt; pip install flask &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;				flask-restful &lt;span style="color:#f1fa8c"&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style="display:flex;"&gt;&lt;span&gt;				pymongo
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Build the docker base image. The &lt;code&gt;-t&lt;/code&gt; flag gives the build a name, &lt;code&gt;--rm=true&lt;/code&gt; means &amp;ldquo;Remove intermediate containers after a successful build&amp;rdquo;, and &lt;code&gt;--no-cache=true&lt;/code&gt; means &amp;ldquo;Do not use cache when building the image&amp;rdquo;. The dot means look for the &lt;code&gt;Dockerfile&lt;/code&gt; in the current folder. Using the flags together means that all of the steps in the docker build are done every time, since I want to make the full docker build, not just the steps after a cached layer.&lt;/p&gt;</description></item><item><title>python flask rest api auth option storing in mongoDB 3.0 running on docker</title><link>http://qone.io/posts/python-flask-mongodb-restapi-auth/</link><pubDate>Sat, 28 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/python-flask-mongodb-restapi-auth/</guid><description>&lt;p&gt;I have been looking at ways to add auth to the rest api endpoints that need it. I will be using Python with flask, flask-restful, yaml, pymongo, and passlib. passlib will be used to salt and hash the passwords stored in mongoDB. Note that I will be using mongoDB 3.0, which means you have to install the unreleased pymongo 3.0. Here are the packages and links you need.&lt;/p&gt;</description></item><item><title>productivity with Pomodoro and micro checklist</title><link>http://qone.io/posts/productivity/</link><pubDate>Wed, 25 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/productivity/</guid><description>&lt;p&gt;I have recently started to use a productivity technique called &lt;code&gt;Pomodoro&lt;/code&gt;. The principle of Pomodoro is that you work focused on something for 25 min, take a micro break for 5 min, and then continue. After 4 sessions you take a longer break of 15 min.
I have been using this in combination with a &lt;code&gt;micro checklist&lt;/code&gt;. What I mean by &lt;code&gt;micro checklist&lt;/code&gt; is a check list like this.&lt;/p&gt;</description></item><item><title>pitfalls when writing commandline tools</title><link>http://qone.io/posts/command-line-tools-pitfalls/</link><pubDate>Sat, 21 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/command-line-tools-pitfalls/</guid><description>&lt;p&gt;A common pitfall that is easy to miss when writing command line tools, in this case using &lt;code&gt;python&lt;/code&gt;, is referring to template or support files relative to the working dir of the script. One of the more annoying mistakes is when template files are stored and referred to in a script and you assume that the person using your script stands in the folder where the script is. Let&amp;rsquo;s assume we use a file template.html in a folder named foobar; of course this will not work if the script is run from somewhere else. I normally stand in the same folder as the script when developing a tool, but when you stand in some other folder everything fails, since you are not using the absolute path to the template files.&lt;/p&gt;</description></item><item><title>dynamic test generation</title><link>http://qone.io/posts/test-generation/</link><pubDate>Sun, 15 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/test-generation/</guid><description>&lt;p&gt;I want to show an api template I did to generate api post tests with a random variable-length hex value as the payload in a http post request. In this case I am using python in combination with &lt;a href="https://nose.readthedocs.org"&gt;nose&lt;/a&gt; to run my tests. Note that you can run this without nose, but you will get all the prints from the support libs; nose suppresses the prints if a test passes.
To change the number of tests generated you just change the &lt;code&gt;random_data&lt;/code&gt; input integer. The purpose of this is to check that a REST api can take random input lengths of the hex payload. From the testmap you get a dict that contains a random int, float, hex value, and password.&lt;/p&gt;</description></item><item><title>stay up to date with tech</title><link>http://qone.io/posts/stay-up-tp-date-with-tech/</link><pubDate>Mon, 09 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/stay-up-tp-date-with-tech/</guid><description>&lt;p&gt;Keeping up with tech. I like to read and be aware of what is going on in tech, security, apple, linux, development, testing, and some devops related topics. I have attached an &lt;code&gt;OPML&lt;/code&gt; file from my &lt;a href="https://feedly.com"&gt;feedly&lt;/a&gt;, where I get most of my news, if you are interested. I frequently listen to a few podcasts using an awesome iPhone podcast app, &lt;a href="https://overcast.fm/"&gt;overcast&lt;/a&gt;. Smart speed saves some time and voice boost helps boost the voice; those are the features that work best for me.&lt;/p&gt;</description></item><item><title>cold brew coffee</title><link>http://qone.io/posts/cold-brew-coffee/</link><pubDate>Sun, 08 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/cold-brew-coffee/</guid><description>&lt;p&gt;Brew some easy and good cold brew coffee. It can be used for plain iced coffee, iced lattes, and more. Note that it will be high in caffeine. What you need is:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;a one liter jar&lt;/li&gt;
&lt;li&gt;coffee beans &lt;code&gt;newly roasted preferred&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;water &lt;code&gt;filtered preferred&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;a french press&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;My favourite coffee roastery is called &lt;a href="http://www.dropcoffee.com/"&gt;dropcoffee&lt;/a&gt;, a small roastery in Stockholm. They always have something interesting to offer and some new coffee roast to try. Most of their roasts are lighter ones; look them up here if you would like to try: &lt;a href="http://www.dropcoffee.com/"&gt;dropcoffee&lt;/a&gt;&lt;/p&gt;</description></item><item><title>control temptation @ work</title><link>http://qone.io/posts/control-temptation-at-work/</link><pubDate>Wed, 04 Mar 2015 00:00:00 +0000</pubDate><guid>http://qone.io/posts/control-temptation-at-work/</guid><description>&lt;p&gt;So what do I mean by &lt;code&gt;control temptation @ work&lt;/code&gt;? When working with development you have to dig into a problem and try to solve it. Doing that requires staying focused on the task at hand to get the most out of it. Let&amp;rsquo;s say it takes 10 - 15 min to get into the right flow. During your day you have email, facebook, online newspapers, chat systems like &lt;code&gt;IRC&lt;/code&gt;, &lt;code&gt;HipChat&lt;/code&gt;, and &lt;code&gt;Skype&lt;/code&gt;, colleagues, and many other things that can pull you out of your productive flow. The result is that every time it happens you lose the 10 - 15 min needed to get back into the productive flow, over and over, and you may end the day realizing that you did not get done what you wanted. That is not fun for anyone.&lt;/p&gt;</description></item><item><title>about</title><link>http://qone.io/about/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://qone.io/about/</guid><description>&lt;p&gt;This is a developer blog by a software engineer working on cloud infrastructure and PaaS. It is about things related to software development, test automation, devops automation using docker, continuous delivery/integration using docker, and open source.
I write about things I encounter at work, things I use, and things I find interesting.&lt;/p&gt;
&lt;p&gt;Most of my current interests are in docker, prometheus, monitoring, ansible, python, go, AWS, and GCE; that is most likely what the coming posts will be about for some time.&lt;/p&gt;</description></item></channel></rss>