Thoughts on Machine Learning

My first blog post in 2017, yey! It was about time. Can't believe almost two months have passed already. The first 2 weeks of this year were quite hectic for me, with 3 important coursework deadlines that drained most of my energy. Then one more, which made me a temporary "NLP word embeddings and LSTM" expert, and after that I finally took a 4-day vacation to incredible Iceland (I'd love to write a post on that trip soon).

However, long story short, I'm in the second half of my second MSc term and I can gladly say I am now much more mature about it. I sometimes like to personify my Masters. In the first term we were like strangers, pretty much scared/terrified of each other. With time, after submitting my first "wave" of coursework and getting more than good feedback, I started to gain more courage and realised "well, maybe this isn't that bad". Some ambitious goals I'd set slowly started to materialise, which gave me an even greater feeling of being able to tackle the impossible ("it always seems impossible until it's done" – N. Mandela). I still don't understand how this could be, but for me it is empirically determined that whenever I truly wish for something from the bottom of my heart, it happens.

So, I wanted to write this post to lay down some of my current views on machine learning from my present standpoint. There is no doubt they will change. I’d just like to keep track of what I think now, how I view it and how I’ll develop my knowledge and perspective on the topic in the future. Be aware, this is just a high level reflection, which is meant to be more humorous than informative. So let’s begin!

By many, machine learning is viewed as a major, growing branch of artificial intelligence. Some like to call it pure statistics. I say it's on the edge between the two. Machine learning algorithms have been developed since the late 1950s, evolving over the decades into neural networks in the 80s and so on. They looked good on paper, but they were impractical to test. As you already know, there just wasn't enough computational power to support the theory. Quick aside: after recently watching 2001: A Space Odyssey, I was amazed at how developed the views on artificial intelligence were in the late 60s.

Fast forward to 2017: thanks to Moore's law, we have incredible computational power, and not only that, we have today's golden resource: data. Well, when I say "we have", I basically mean a few lucky big corporations have truly meaningful and useful data. The more data you give to a machine learning algorithm, the better it will generalise to new, unseen data. Just like a human brain: if it only knows one language, it will be impossible for it to grasp the meaning of words from a different, unrelated one, no matter how well it knows that initial language.

I'll make another aside here for a funny story (which I consider relevant and I like): a friend of mine was on a plane reading a book in Romanian. An Englishman sitting next to my friend asked after a while, "What language is that? Is it Romanian? Because I don't know it." Surprised by the specific question and the hypothesis, my friend replied, "Is Romanian the only language you don't know?" The Englishman laughed and explained how he had reached his conclusion. He noticed it looked a bit Slavic, but also very Latin, and it used a Roman alphabet. I particularly liked how he pondered the matter, connected some dots, made some assumptions, probably ran a few guesses through his mind and decided to commit to a final prediction. And this again makes me wonder how computers will be able to "think" like this. But wait a second, they already do. Based on different criteria, but they do a pretty good job. Google or Bing Translate already guess the language you type in, without you specifying it. The problem now is, we see it as normal for a computer to be able to do that, but quite spectacular for a human. Here is the point where I'd probably philosophise. It's interesting, isn't it?

Cool, so why is there so much statistics and mathematics behind machine learning? Well, because of data. Whenever you have a lot of raw data, you want to make some sense of it, so you call upon statistics, which calls upon some maths. A Statistics superhero will come around and say "Hey! I know some formulas that will help you understand the mean of this data and its standard deviation; you'll draw some pretty cool distribution graphs and find some pretty cool numbers. Do you want to do this on paper? Or should I call my friend, the Computer superhero? He knows some pretty cool shortcuts". Of course, for a 5 by 6 table of data you'd probably do a pretty good job on your own. But when you're given 10GB of data, you might have no other choice than to call the Computer superhero to the rescue. So cool, he now helps you out with your data set. But his superpowers are limited. He has a lot of memory and does calculations really fast, but he requires a human to give him some algorithms (sets of instructions, like recipes) and he will produce a result (the final meal), which could be bad or good, depending on how skilled you are as a "cook". If you tell him to keep the food in the oven for 2 hours and then 1 more, those instructions will probably destroy your end result. Bottom line: what a machine learning specialist knows is how to be a 5-star Michelin chef. They'll know, for a specific data set, which ingredients from the machine learning toolbox complement it nicely, distinguish its patterns and highlight them. They'll also know how to use the hardware at hand, CPUs and GPUs, to make sure they get the desired, expected, perfect end result in time.
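To make the superhero's formulas concrete, here is a tiny sketch (my own illustrative code, not from any particular library) of the two numbers the Statistics superhero mentions: the mean and the (population) standard deviation of a data set.

```javascript
// Average of a list of numbers.
function mean(xs) {
  return xs.reduce((sum, x) => sum + x, 0) / xs.length;
}

// Population standard deviation: the square root of the average
// squared distance from the mean.
function stdDev(xs) {
  const m = mean(xs);
  const variance = xs.reduce((sum, x) => sum + (x - m) ** 2, 0) / xs.length;
  return Math.sqrt(variance);
}

// For a tiny table you could do this by hand; for 10GB of data,
// the Computer superhero runs the exact same recipe, just faster.
console.log(mean([2, 4, 4, 4, 5, 5, 7, 9]));   // 5
console.log(stdDev([2, 4, 4, 4, 5, 5, 7, 9])); // 2
```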

I'll end this post by paraphrasing a really cool quote I read about being a data scientist nowadays. It says that a data scientist is better at statistics than a software engineer and better at programming than a statistician. However, it is equally true that they are worse at statistics than a statistician and worse at programming than a software engineer. Different viewpoints, same idea.

Hacktrain EU

What an experience! As soon as I heard about this amazing hackathon, I knew I had to participate! Hacking on a train for 48 hours from London to Paris, from Paris to Lyon and back sounded incredible. And it was! It was SO much fun!

Of course, the theme was hacking the rails and this was the hack’s third edition! Organised by Hack Partners, this event is more than just a fun weekend. At the end of the hack, 10 interesting projects get selected and go into an accelerator program, becoming real start-ups in less than a year.

Two amazing examples from last year include Busybot and Vivacity. The first is integrated into the Trainline app and allows customers to find where the empty seats are on a train, so they can wait next to that coach. A very convenient feature for peak-hour commuters. Vivacity, on the other hand, was developed by a group of Cambridge graduates and offers state-of-the-art video analytics. Using machine learning, it can also predict the number of people at a station, boarding or alighting the train, queues at the ticket office and so on.

This year, each sponsor provided challenges and data sets for the participants. Out of 600 applicants, 80 hackers were selected to take part, and we were split between 2 trains: HackTrainUK and HackTrainEU. I was lucky to go on the EU one, having the opportunity to see my beloved Paris once more (if only for a few hours). The four main sponsors were Arriva, Eurostar, TfL and SNCF, joined by EY, BAE Systems, the Department for Transport and many more.

Our journey started at St Pancras, taking us in... wait a minute... Business Class!!! to Paris. It was amazing! As the teams were already formed, this 2-hour trip offered us the chance to brainstorm ideas and organise our projects. I was really surprised that as soon as we entered France, the train's speed reached 185 miles/hour. It was crazy!

I was part of a team of 3 hackers and we chose to focus on allowing train companies to improve their communication with clients during disruptions, helping them also store data and make predictions about the number of passengers impacted by such future events.

For this, we used TfL's data set from the Tramlink service, where door sensors count how many people board and alight the trams at each station. This was great for us: we managed to mine the data and observe commuter patterns, the periods when trams are disrupted, how many people are on those trams, and how many people are waiting to board at the next stations. Below are some screenshots of our prototype, which allows train controllers to view the passenger data for each train passing through a station.
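The core of that mining is simple once you have per-stop boarding and alighting counts. Here is a minimal sketch (invented numbers and field names, not TfL's actual schema) of the running-occupancy calculation behind it:

```javascript
// Occupancy after each stop = previous occupancy + boarders - alighters.
function occupancyAfterStops(stops) {
  let load = 0;
  return stops.map(({ board, alight }) => (load += board - alight));
}

// Toy journey: a disruption would show up as unusually high boarding
// counts (people piling onto the first tram that arrives).
const journey = [
  { board: 12, alight: 0 },
  { board: 7, alight: 3 },
  { board: 20, alight: 2 },
];
console.log(occupancyAfterStops(journey)); // [ 12, 16, 34 ]
```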


The winner of this year's edition was FlexiRail, who developed an app that lets users opt for flexible train times so they are offered a ticket on the most convenient route to avoid overcrowding. The runners-up, AutoMapr, tackled the most technically challenging data set, the LiDAR data, applying machine learning to visualise train wires. What they managed to do in 48 hours was amazing! The topic is usually one for research groups at Princeton University.

The weekend was packed with amazing ideas, people and stories! I would strongly encourage other people to join the Hacktrain next year as it truly is a unique hacking experience!


Microsoft Codess Event

Also published as a guest post on the Codess blog here.

Where do I begin? This was an amazing event!! From the beginning until the very end, I enjoyed every second of the evening. So, huge congrats to the organisers!

Codess is Microsoft’s global community for female coders. They organise events around all major Microsoft campuses, allowing women in this industry to network and find inspiration.

Last week’s event was entitled Future Technology and had 3 incredible speakers. Each talk was literally mind-blowing and it covered exactly the topics I am most interested in at the moment. Chance, luck? Who knows?

Haiyan Zhang, innovation director at Microsoft Research's campus in Cambridge, spoke about the amazing progress made on treating cancer. I was fascinated by the talk and it made me feel much, much more optimistic about the future. It's unbelievable to think that programming could help us find a cure for cancer. Just mind-blowing! The video she presented explained how a cell can identify itself as a cancer cell and automatically start self-destructing.

Then Bunmi Durowoju, senior business development manager at Microsoft, spoke about my absolute favourite topic: connecting the 4.5 billion unconnected people. She spoke about unused TV white space (UTWS), explained in more detail in their post from 2012 and on the Microsoft Whitespaces page. This is an amazing way of delivering broadband and has already been implemented in countries such as Jamaica, Namibia, Tanzania and the Philippines.

The last talk came from Maria Rakusanova, virtual reality lead and senior product marketing manager at Samsung. She talked about Samsung's latest developments in virtual reality and their Gear VR, powered by Oculus. What amazed me was the sheer number of applications it can have. One example that I am sure applies to a lot of us is letting people practise public speaking. With a virtual reality of a huge audience in front of you, it takes preparing for presentations to a whole other level! More useful applications, however, include educational experiences created, for example, to allow medical students to attend live surgeries via virtual reality. The other industries where it finds good applications are numerous and diverse. For movies, entertainment or fashion, concerts or events can be filmed using the Samsung Gear 360, a camera that enables anyone to film their own VR experiences (a competitor to GoPro's 180-degree view). At the event, Maria let us test the Gear VR and I managed to "sit" in the next room with Elliot from Mr. Robot. Great experience!

Overall it was an amazing evening, complemented by amazing people! I am really looking forward to the next Codess event, and I plan on doing more research on Microsoft Research's progress on programming DNA, as well as on the white space spectrum technology. This is real technological progress in areas that really need our full attention at the moment. I can sum them up simply: treating cancer and empowering developing countries.



Facebook Intern and Grad Summerfest Event

On a nice July Wednesday evening, Facebook extended an incredible invitation to 12 undergraduate, Masters and PhD students to have dinner and enjoy a couple of hours at their London offices in Warren Street. I felt really lucky to be offered this chance and it was truly an amazing experience. It allowed me to see what life at Facebook feels like, the culture they have and the way they tackle day-to-day challenges.

First off, we had dinner with a group of Facebook engineers and recruiters who answered our questions about their roles, the application process and their careers so far. They were really helpful and positive about their roles at Facebook and emphasised the amazing hacking culture they have.

Then we had a great tour of the three floors, exploring the interesting rooms, their naming themes and finding out about their usual “office traditions”. It was a great evening and I am happy I had the chance to explore the offices and meet incredibly interesting people!



ABI Local GHC1 London

What an amazing conference! I don't remember the last time I was so happy! GHC1 was the first of its kind in London, and after attending the US GHC last October, I can tell you this one was just as good and gave me the same "oh my God, how amazing these women are" kind of feeling!

The reception on Tuesday evening, held at Google's offices in Soho, was an incredible opportunity to network and meet exciting women in technology. I had an amazing time and felt great that I recognised so many faces and got the chance to chat with old friends.

Then on Wednesday, even before the opening keynote, I had the chance to mingle at the career fair, where companies such as Facebook, Google, Microsoft, Bloomberg and Palantir had their own booths. Then Sarah Wilkinson, CTO of the UK's Home Office, opened the conference and shared priceless career advice with over 250 attendees. She particularly outlined the following 4 ideas:

  • believe in yourself
  • don’t be afraid of failure
  • work for people who believe in you
  • believe in each other

Even though they may sound stereotypical, the way she supported each claim with sound real-life examples made the entire speech inspirational and motivational. She has seen many women step back from a chance in their career just because they didn't believe in themselves.

After that, the conference was divided into 3 main tracks: technical, entrepreneurial and leadership. Before I left for my team's day out (cooking classes in Soho), I attended 2 technical-track talks about emerging technologies. One discussed 3D printing, and the other explored an individual's experience of learning new technologies, viewing the entire process through the principles of software craftsmanship. I promised myself I'd explore these principles in more detail. This book is a very good starting point: The Software Craftsman: Professionalism, Pragmatism, Pride by Sandro Mancuso, a book from Uncle Bob's series.

The closing keynote came from Katharine Zaleski, CEO and co-founder of PowerToFly. She created this amazing platform, which empowers women to get in touch with hiring managers all over the world and find job opportunities that align with their need for flexible working hours. In her speech, she outlined her career path and the benefits she sees in companies hiring more diverse teams.

I can’t wait to join the next one and I feel so incredibly lucky to live in London and be part of such an amazing community of women in technology!

JSConf Budapest


I was so excited about this conference, you have no idea! It was a great opportunity for me to get introduced to the incredibly warm JS community. Since my personal goal for the first half of 2016 was to perfect my JS skills, this conference was an amazing chance to get up to date with the latest trends, news and practices, and also to meet people from whom I can learn.

I was lucky enough to be awarded a scholarship to attend JSConf and I am extremely thankful to the organisers and the NGOs that created this possibility for me. I'll highlight in this post some of my key learnings from my favourite talks, without forgetting to mention that I received an overdose of inspiration that allowed me to start my very own creative project (I love it!!).

First of all, the "master of ceremony", as the organisers called him, Jake Archibald (@jaffathecake), was an incredible, incredible MC! All his introductions and interventions on stage were super, super funny. I particularly liked his call for programming/movie puns, which revealed some hilarious ideas: "Chrome Alone 2", "Planet of the APIs", "Batman returns true", "Forrest Gulp", "The usual SASSpects", "Gruntzilla" and much, much more. He also advised participants at the beginning that "being on fire is highly discouraged".

Safia Abdalla (@captainsafia) gave a great talk on memory management in JS, a topic I hadn't thought much about before the conference. She discussed the V8 JS engine specifically, explaining how it works and how it allocates memory on a heap structure. If you'd like to find out more about this topic, here's a webpage/course she created related to the talk. Next, Yan (@bcrypt) gave a great and very relevant talk on encryption and the move to HTTPS. It was great to find out about the Let's Encrypt project, an open-source initiative from the EFF, Mozilla and the University of Michigan that automates the TLS configuration process and issues HTTPS certificates for websites at no cost.

Denys Mishunov (@mishunov) discussed website performance as the user's perception of how fast a website behaves. He encouraged developers to move a bit away from overthinking metrics and to focus on how the user's brain perceives the difference between a website that loads in 700ms and one that loads in 800ms. He stated that a website needs to be at least 20% faster before users can notice any difference. Then Nick Hehr (@hipsterbrown), a front-end developer from Brooklyn, NY, ended day one with a great talk on empathy in the developer community. He talked about encouraging each other and offering constructive feedback as often as we can.

Day two started with an incredibly inspirational talk from Suz Hinton (@noopkat), an Australian front-end developer based in NY. She talked about a creative project she developed on her own: recording sounds in the subway, transforming notes into pixels and then editing the resulting image to form beats and a nice, funny web app. Her story motivated me to go and put my own creative idea into practice, and I can't tell you how happy I am to have started working on it 2 days ago. I plan to finalise it soon, by the end of this month.

Then Nicolás Bevacqua (@nzgb) gave another great talk on site performance. This one did focus on improving metrics, by optimising TCP, HTTP, HTML, CSS, fonts, images and, of course, JS. He recommended High Performance Browser Networking, a book by Ilya Grigorik, which sounds like a very good read for an in-depth perspective on the topic and which I am sure will prove super useful for any future project.

A great talk was also offered by Rob Kerr (@robrkerr). He is a researcher at IBM Watson in Australia and he showed how JS helped him and his team develop some incredible data visualisation tools that were very useful in their research on neurons. One of the things he mentioned at the beginning of his talk is that the web was, indeed, created at CERN for the scientific community. It is great to find out that it still complements researchers' work.

In the final talks, Claudia Hernandez (@koste4) explored JS particularities in a very funny Alice in Wonderland themed presentation, while Lena Reinhard (@lrnrd) sent a very powerful message about debugging the tech industry. She referred to the problem that sits between the keyboard and the chair, and talked about the diversity and inclusion challenges the industry is facing at the moment.

I am so so happy and thankful for being offered this incredible opportunity and I cannot wait to start taking some action and using the learnings and inspiration I gathered to develop my own amazing projects.

Render Conference

The Render conference, which took place in Oxford on 21-22 April, was an incredible experience! I learnt so many cool new things and met amazing people.

The opening speaker was Bruce Lawson (@brucel), deputy CTO at Opera, who gave a very funny and entertaining talk on the challenges the web is facing, in particular from native apps. The place where the web lost the battle against native is the land of UX. Progressive web apps have so many advantages over native ones that it's imperative to invest in very, very good UX. Probably one of the biggest arguments in favour of the web is app size. With limited memory on a device, a user starts considering "hey, is this app more important than my photo collection?". One example he gave of the difference was an app that is 1MB on the web and 403MB implemented as native. Also, web apps live on the server, so there is no update distribution lag, and they can now work offline thanks to service workers.

Speaking of service workers, another amazing talk came from Jake Archibald (@jaffathecake), developer advocate at Google. He gave a great walkthrough on how to use them in your app, with a side-by-side comparison of offline mode and very, very slow Wi-Fi, which he called Lie-Fi. Lie-Fi is arguably even worse than being offline, since you get to stare at a white screen for an indefinite time, whereas when you're offline, at least you get a message telling you that you have no connectivity whatsoever. His talk made me realise just how important it is to offer users a good experience when they're offline. You can always cache smartly, save their input and alert them that their actions have been performed as soon as they're connected again (e.g. Facebook Messenger's chat messages). For example, when I showed a friend of mine that he can play a Mario-like dinosaur game in Chrome when he's offline, he was fascinated. There is also a great short course on offline web applications on Udacity, my favourite MOOC platform. It shall receive my full attention soon!
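The heart of the pattern is a fetch handler that answers from the cache when it can and from the network when it must. Below is a generic sketch of that cache-first strategy (the cache name and file list are placeholders of my own, not from the talk), with the decision pulled out into a plain function:

```javascript
// Serve the cached response when there is one, otherwise fall back
// to asking the network.
function respondFrom(cached, fetchFromNetwork) {
  return cached || fetchFromNetwork();
}

// Inside an actual service worker, the wiring looks roughly like this:
//
// self.addEventListener('install', (event) => {
//   event.waitUntil(
//     caches.open('offline-v1').then((cache) => cache.addAll(['/', '/offline.html']))
//   );
// });
//
// self.addEventListener('fetch', (event) => {
//   event.respondWith(
//     caches.match(event.request)
//       .then((cached) => respondFrom(cached, () => fetch(event.request)))
//   );
// });
```

Even with this minimal strategy, a user on Lie-Fi gets your cached shell instantly instead of a white screen.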

Even though I aim for simplicity in an app's design, Val's (@vlh) talk on integrating animation smartly was amazing! I had no idea how much you can achieve with CSS alone and how much support there is in the browsers' dev tools for testing and debugging animations. Smart, subtle animations can make a huge difference to your app compared to static elements. They enhance the UX without users even noticing them and help communicate your business's brand and personality. She also shared a great tip on the best book used by great animators, and it comes from none other than the masters of animation at Disney: The Illusion of Life, the book behind the famous 12 principles of animation.

Another talk I liked was given by Alicia Sedlock (@aliciability) on front-end testing. She went through all the available options, including unit, automation, acceptance, visual regression, accessibility and performance testing. She emphasised the need to prioritise testing on the front-end side and offered examples of great tools for each category. Jeremy Keith (@adactio) also gave an incredible talk at the end of the conference, discussing resilience in the world of the web. Emphasising progressive enhancement, his talk was a real inspiration for a young developer like me. He brought great arguments in favour of this approach, summarised in his slide below:


What was new at RenderConf was that, as a scholar, I was assigned a guide during the conference. Mine was Seb (@seb_ly), an amazing conference speaker and graphic designer, passionate about animation and building things from scratch. He showed me some of his coolest recent projects, and I was particularly impressed by this one on Laser Light Synths.

So these were my favourite moments from the 2-day conference, but there is much more I could write about endlessly. I also had the chance to meet Todd Motto (@toddmotto), and after the conference I started his course on AngularJS, which has been absolutely incredible so far! I've been caught up in final coursework and exams, but I'll finish it soon.

Facebook Hackathon

This weekend was an amazing experience! 24 hours of non-stop coding and fun. The Facebook team offered us the most amazing time, with great food (all the time), drinks, snacks, music and raffles, everything you could ever want.

I met my team on the day and we quickly brainstormed ideas before the hack officially started, around 12:30 on Saturday. I was very keen on developing a project for social good, so before the hackathon I researched the initiative and its commitment to making the Internet accessible to everyone in the world. Their guidelines for developers suggested eligible apps should not include rich data such as video, high-resolution images or anything that would require large HTTP responses. As a result, we thought: what is text-only and awesome? Code!

So we decided to make an app that teaches children in developing countries to code by sending them short challenges they could solve either via the mobile app or offline, through text messages. If you sent a piece of code in a text message, it would be compiled on the server and a reply would be sent back with the output of your code, or an error message if something wasn't right.

For that, we thought we should create a new programming language, easy for children to understand. We called it Lilo (from Lilo & Stitch); it compiles to Python on the server side and also maps to JavaScript on the client side. We thought up a funny, childish syntax for our new language. A variable is a box, since you can store "things" in a box. Boxes are labelled with names, just like normal variables, can be manipulated, and can store any type of object: numbers, strings, boolean values and so on. Boxes are also resizable and can store more than one object. In terms of commands, to output something we defined say '..', as a natural command to let your code say something. Instead of functions, we decided to have dogs. A dog is called and can return something you've instructed it to. A dog knows how to do only one thing, and when required, it "fetches" you what you need in your program. Moreover, a dog is happy to be called any time you need it. Loops and conditional statements stay the same in Lilo, while a line end is marked by a dot, as it is easier to type in a text message than a semicolon.
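To give a flavour of the mapping (this is my own toy reconstruction from the rules above, not the actual translator we wrote at the hack), a couple of Lilo constructs could translate to JavaScript line by line like this:

```javascript
// Translate a single Lilo line into JavaScript, following the rules
// described above: "box" declares a variable, "say" prints, and a dot
// ends the line instead of a semicolon.
function liloToJs(line) {
  const isSay = line.startsWith('say ');
  return line
    .replace(/^box (\w+) = /, 'let $1 = ') // "box" -> variable declaration
    .replace(/^say /, 'console.log(')      // "say" -> output
    .replace(/\.$/, isSay ? ');' : ';');   // trailing dot ends the line
}

console.log(liloToJs("box age = 7."));  // let age = 7;
console.log(liloToJs("say 'hello'.")); // console.log('hello');
```

The real compiler handled dogs, loops and conditionals too; regex substitution like this only works for the simplest one-line cases.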

The Lilo-to-Python compiler was written from scratch during the hack. For the SMS processing we used Twilio, a cloud service that handles this sort of interaction with the server. We tested the system from multiple phone numbers and it worked every time, just like in my screenshot below. The hashtags #bc and #ec simply stand for "begin code" and "end code", needed when parsing the content of the text message for testing purposes.


On the frontend side, for when Internet access was available, we created an Ionic app with a simple interface for 9 basic lessons and Facebook authentication. Since it is aimed at children, we also integrated a story based on Facebook's sticker character, Biscuit, which teaches children how to code through funny stories. For example, to explain loops, we imagined the student and Biscuit playing hide and seek, so the student had to come up with a program in Lilo to count from 1 to 10. When we introduced box assignment, we said Biscuit was giving the student 2 boxes, and they had to fill them in with their name and phone number and label them so they wouldn't forget what was inside. The code is syntactically checked in CodeMirror, a JavaScript-based live code editor inside the app. The code is also mapped from Lilo to JS on the client side, so little to no server-side requests had to be made.

In addition to all this, on the app's Facebook page, users can post a snippet of Lilo code and receive the output of the compiled code as a comment on their post. This makes use of the Facebook APIs and checks the page for new Lilo code posts every 30 seconds.


Even though we didn't win the hack, it was amazingly fun! My team was purely awesome and we had a great time hacking and sharing stories! The winning team developed a new Terminal experience for developers, enriched with all sorts of new features.

The project is available on GitHub and you can access the app here (it has bugs, we know, but it’s a hack!). Here are some screenshots and photos I took during the hackathon.







The aftermath:


The team:



My presentation on Celebrating Women in STEM

On International Women's Day, I was invited to talk about my passion for computing science and the topic of women in computing at my University's "Celebrating Women in STEM" event.

It was a pleasure to speak on behalf of the newly formed Women in Computing society, which has great prospects! I am happy to hear from second- and third-year undergraduates who show great interest in continuing my initiative!

I have uploaded my presentation to SpeakerDeck, so anyone can see it and pick out any useful information.


Visit to the National Museum of Computing

Yesterday I went on a super trip, organised by my University's Computing & Engineering department, to the National Museum of Computing at Bletchley Park. I visited Bletchley Park a year ago and found it really interesting: all the Enigma machines, the stories about the female codebreakers and Alan Turing. Everything was really inspiring. However, I didn't visit the computing museum, which is right behind the Bletchley Park entrance.

This time we were lucky, because we had our own private tour of the museum with 3 lovely guides who were extremely passionate about computing. Their careers started in the mid 60s, so they experienced a large part of the evolution of computing, from huge mainframe computers and memory disks to the mobility and power we have today. By faaar the best part of my day was the opportunity to code in BASIC on a microcomputer from 1981, which had only 32KB of RAM and could display just 4 colours: red, yellow, black and white (the last two are basically non-colours, but heh). We were given 28 lines of BASIC code that would create a Snake game. If we were lucky enough not to encounter any errors, we would then build on top of the game, adding new features, changing control keys, adding obstacles and so on. I also programmed the game so that I could never lose. It wasn't as fun as I expected. Moreover, I don't think I've ever taken so much care typing code in my life. If you pressed a key like "Break" by mistake, your entire 'work' would be gone. When typing a new line of code, you also had to include the actual line number, and the lines had to be in strict order. If you wanted to copy and paste a line, there was no Ctrl+C Ctrl+V, unfortunately; the arrow keys and the copy key were your friends. There wasn't any mouse either. I can't believe how fun it was, given all the limitations! It was back to BASICs!

I will leave you now with some of the photos I took: not very professional, but they still capture some very interesting things I was lucky to see there.

Below is the oldest working computer in the world, and on the left-hand side you can see all its memory.


This is a radio station, used in the Second World War:


A British "Google Maps" from the BBC, developed in the 80s and stored on discs.


The BBC Microcomputer from 1981, with 32KB of RAM, and the lines of BASIC code for the Snake game: