So it's been a while since Hacktoberfest and I had actually forgotten about the t-shirt that you get for taking part, but the other day it arrived! And there it is!
I'm really happy with it and will definitely be taking part next year :)
A while ago I posted a blog post about a hackathon project I did for work. That project required me to run SSH sessions to many machines at once to query the state of the system. Recently I've been listening to a lot of the Talk Python To Me podcast and hearing all about the new async features of Python 3. This got me thinking about that project and how it would be the perfect candidate for some async functionality. Why is that?

Timeouts
We had a fairly large number of machines that we wanted to run these SSH commands on, and many of those machines were simply not turned on or were unavailable for SSH connections. This caused an issue, as we simply had to wait for the timeout to occur, which was set to about 1 second to give enough leeway for a slow network. During this 1 second window each of the worker threads was just sitting there counting down the milliseconds. This was extremely inefficient!

Stateless (Hey, it worked out)
Python introduced the async and await keywords, but unfortunately it is likely going to be a while before there are many ORMs that are async capable. Handily, I had already decided that the API was going to be fairly stateless other than the in-memory cache, which means implementing this new approach should be fairly simple.

Approach
I used Flask as the web framework for the API, and there is now a drop-in replacement being developed called Quart. This brings async and await functionality to that same development style that everyone loves with Flask.
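As a rough illustration of the approach (a minimal sketch, not the project's actual code: the host list, command and route are made up, and it leans on the asyncssh package mentioned in a moment), a Quart route can fan the SSH commands out concurrently and give each connection its own short timeout, so unreachable machines no longer tie up a worker thread:

```python
import asyncio

import asyncssh
from quart import Quart, jsonify

app = Quart(__name__)

# Hypothetical values for illustration only.
HOSTS = ["machine-01", "machine-02", "machine-03"]
COMMAND = "uptime"
TIMEOUT = 1.0  # seconds - offline machines fail fast instead of blocking a thread


async def run_command(host):
    """Run one SSH command on one host, failing fast if it is unreachable."""
    try:
        conn = await asyncio.wait_for(
            asyncssh.connect(host, known_hosts=None), timeout=TIMEOUT
        )
    except (OSError, asyncio.TimeoutError, asyncssh.Error) as exc:
        return {"host": host, "error": str(exc)}
    try:
        result = await conn.run(COMMAND)
        return {"host": host, "output": result.stdout.strip()}
    finally:
        conn.close()


@app.route("/status")
async def status():
    # Every host is queried concurrently; an unavailable machine now only
    # costs its own timeout rather than stalling a whole worker thread.
    results = await asyncio.gather(*(run_command(host) for host in HOSTS))
    return jsonify(results)


if __name__ == "__main__":
    app.run()
```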
Along with Quart there is another Python package called asyncSSH, which allows the use of async and await for SSH connections. So the actual changes required to test out this new approach hopefully shouldn't be too much effort. The big question is: will this actually improve performance?

At work we occasionally have a Hackathon day where everyone in the lab is able to work on any project they like, whether that's just practising some Test Driven Development (TDD) or creating something that could help with your day-to-day work.

The Idea
For this Hackathon I got together with Adrian Moldovan and Tom Fletcher to work on a project that would hopefully address a major issue we have at work. Our product runs on many different platforms and we don't have all of those platforms in each of the teams, so we often have to share our hardware. The problem is that in juggling the hardware between teams, some of it can get lost along the way. The major problems we have are:
Finished Product
As you can see, we have quite a variety of kit on many different levels, and this information is invaluable for quickly finding the piece of hardware you require.

Tech
So let's talk about the tech used for this project. Firstly, I've been reading a lot about microservices and was intrigued by the concept of splitting what would normally be one large application into smaller portions. By no means is this project a microservice-based architecture, as it lacks some of the infrastructure required to be that, but it is split into two smaller pieces:

- A Python Flask REST API
- A React frontend
Python Flask REST API

Multithreading
The basic idea for the project was to run the same SSH command against all of the machines and get a point-in-time capture of all the state required to solve the above problems. To achieve this we decided to use threads so we could kick off multiple SSH sessions at once (there's a rough sketch of this at the end of the post). I tend to avoid multi-threading whenever I can because it massively increases complexity, but in this situation it seemed like the correct thing to do.

Stateless-ish
I really liked the idea of the REST API being a really simple worker that just ran commands on multiple machines and output the result. I decided that we should try to keep it fairly stateless, so the only real state the API has is a simple in-memory cache, which wouldn't even be necessary if it wasn't for the sheer number of machines we are running the commands on. The cache entry for each machine is only held for 15 minutes to ensure the data isn't too stale.

React Frontend
This portion of the project is where all the shiny things happen; without it the user would need to be able to easily read and interpret JSON data, which is not a skill many people have.

Search and Filter
For ease of use we added search functionality along with some filter buttons, which allow the user to quickly narrow down to the choice they are after. There is also a refresh function which calls the API and tells it to do a full refresh, bypassing cached data. This is mostly for people who really need the most up-to-date information.

Infrastructure

Docker
For this project we really wanted to try out Docker, as we wanted to be able to deploy these two applications as simply as possible. After all, this was a Hackathon, so it was a perfect opportunity to pick up new skills. Both projects had their own separate containers, and the frontend simply needed an argument on run which passed in the address of the API.

Lessons Learned
I think the project went really well, and everyone we presented it to was really impressed and wanted to see what else we could get out of it.

Architecture
I think the biggest things I learnt were to do with application architecture, specifically how this style of application-level separation of concerns allows for extendable and reusable functionality. The API itself can be used by any frontend client. If React is no longer the flavour of the month then a new frontend could be made, or if someone wanted this functionality on mobile they could still simply call the same API.

Containerisation
This whole idea of containerisation I now see is incredibly powerful. It is much easier developing on the same platform that you are deploying to, and the ability to deploy anywhere that has just the one dependency of being able to run Docker is so useful. I will definitely be doing a lot more with Docker in future.

Improvements
There are always improvements that can be made; specific ones I would like to look into are:
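For reference, here is a minimal sketch of the kind of threaded worker and 15-minute cache described above. It is not the project's actual code: the host list, command and helper names are made up, and paramiko is used purely as a stand-in SSH client since the post doesn't name the library the threaded version used.

```python
import time
from concurrent.futures import ThreadPoolExecutor

import paramiko  # stand-in SSH client for this sketch

# Hypothetical values for illustration only.
HOSTS = ["machine-01", "machine-02", "machine-03"]
COMMAND = "uptime"
CONNECT_TIMEOUT = 1.0   # the ~1 second timeout mentioned above
CACHE_TTL = 15 * 60     # in-memory cache entries live for 15 minutes

_cache = {}  # host -> (timestamp, result)


def query_host(host, use_cache=True):
    """Run the command on one host, serving a cached result if it is fresh enough."""
    if use_cache and host in _cache:
        stamp, result = _cache[host]
        if time.time() - stamp < CACHE_TTL:
            return result

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    try:
        # An offline machine blocks this worker thread for the full timeout,
        # which is exactly the inefficiency the async rewrite addresses.
        client.connect(host, timeout=CONNECT_TIMEOUT)
        _, stdout, _ = client.exec_command(COMMAND)
        result = {"host": host, "output": stdout.read().decode().strip()}
    except Exception as exc:
        result = {"host": host, "error": str(exc)}
    finally:
        client.close()

    _cache[host] = (time.time(), result)
    return result


def query_all(use_cache=True):
    """Fan the command out to every host using a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=20) as pool:
        return list(pool.map(lambda h: query_host(h, use_cache), HOSTS))
```

In this sketch the refresh behaviour mentioned under Search and Filter would simply map onto calling query_all(use_cache=False), forcing every machine to be queried again rather than served from the cache.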
At work I recently helped organise a Hackathon for the Manchester IBM Lab. I wanted to take some time to do a retrospective of the projects that my team worked on. Our team consisted of Toby Fleming, Manuel Cantu Reinhard and myself. We ended up doing two projects: one was more of an electronics project and the other a software development project.

GDI (Graphical DNS/DHCP Interface)
Our team created a web application to allow IBMers to add hardware to the lab infrastructure with DNS/DHCP configuration. Prior to the Hackathon a lot of preliminary work was put in to create a git repo with a Vagrant/VirtualBox VM and Ansible provisioning to set up the development environment. This allowed our team to get straight to work on the application, which would have been unachievable without that preliminary setup. The application itself was a Flask Python application. The front end used Flat UI/Bootstrap CSS with some LESS alterations to adjust the look and feel, and we used Javascript to allow some dynamic content to be added to the forms. This project isn't quite as exciting from a visual point of view as our second project, so I will get straight on to talking about that.

DaaS (Doughnuts-as-a-Service)
We have a doughnut rota at our office which means that every Friday whoever is down for that week has to go and buy some Krispy Kreme doughnuts for the lab. The steps to do this are:
We thought we could do better than this, so we came up with DaaS! The idea was that when any sugary treat is delivered to the kitchen there should be a way to instantly let everyone know it has arrived. Nobody wants to be waiting an extra ten minutes for someone to get to their desk and compose an e-mail to be informed of doughnut arrival.

Getting Started...
We set off to work hooking up the Raspberry Pi with the flick switch and "big red button" that we had ordered. At various points through the day we stopped for a whiteboard session to discuss design decisions and any problems. After hooking up the electronics and doing some simple tests, we started working on the implementation. We used a simple Python GPIO library for interfacing with the flick switch and button (there's a rough sketch of this below).

The Final Product
So here is what it looks like fully working with the switch activated (obviously the Pi goes inside the box). I'm actually really proud of this, as it was a pretty cool project to work on.
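For a flavour of how little code the GPIO side needs, here is a minimal sketch using the gpiozero library. This is an assumption for illustration only: the post doesn't name the library we actually used, and the pin numbers and notification function are made up.

```python
from signal import pause

from gpiozero import Button

# Hypothetical pin assignments for this sketch.
flick_switch = Button(17)    # the flick switch on GPIO 17
big_red_button = Button(27)  # the "big red button" on GPIO 27


def announce_doughnuts():
    """Stand-in for whatever notification DaaS actually sends out."""
    print("Doughnuts have arrived in the kitchen!")


def reset():
    print("Doughnuts are gone - switch flicked off.")


# gpiozero fires these callbacks when the inputs change state.
flick_switch.when_pressed = announce_doughnuts
flick_switch.when_released = reset
big_red_button.when_pressed = announce_doughnuts

pause()  # keep the script running and wait for events
```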
Thanks.