Starting your own Portfolio - Generating text with GPT-2

9 September 2020
Marta López

Head of Marketing and Communication

Do you want everyone to be able to try out that cool ML model you made to classify cats and dogs? Or do you want to deploy your model and start building your own portfolio? Either way, you might be interested in the webinar we held a few days ago with Alejandro Díaz, a Machine Learning Engineer currently working in Australia, where we learned about the importance of a portfolio in the programming world. We also developed and deployed an application to start creating our own portfolio.

Along with this article you can watch the webinar on portfolio creation and also access the presentation.

A portfolio is much more than a CV; in fact, it is proof that you are capable of doing what your CV claims. It is also a real differentiator when it comes to your next interview.

Is it really worth it? Without a doubt! It does require time and patience, and it must be maintained over time, but the result is worth it. As we said before, a portfolio that includes all your projects can make a difference in a job offer. It also allows you to create your own brand, your own identity.

You don't need to have 200 projects, contributions to open-source projects, or to have worked for Bill Gates himself... None of that. You just need to review the projects you have worked on or experimented with (even the TensorFlow object-recognition tutorial), plus the time and desire to present them in a clear and elegant way in your new portfolio. And if you think you don't have any project, the mere fact of building the portfolio is already one! But don't worry, because we will develop an application for you to include in your portfolio and learn how to display your ML models so that everyone can try them out!

In the webinar we saw how to implement an application to generate text using the famous GPT-2 model. We used Flask to create our service and Docker to containerize it, and finally we deployed the service on Cloud Run (one of the Google Cloud serverless options). Here you can check the final result.
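To give an idea of what the service looks like inside, here is a minimal sketch of a Flask endpoint that wraps GPT-2 through the Hugging Face transformers pipeline. The /generate route, the request format and the model size are assumptions made for illustration, not the exact code from the webinar template:

# app.py - minimal sketch of a GPT-2 text-generation service with Flask.
# The /generate route and its JSON format are illustrative assumptions,
# not the exact webinar template.
import os

from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)

# Load the smallest GPT-2 model once at startup so requests stay fast.
generator = pipeline("text-generation", model="gpt2")

@app.route("/generate", methods=["POST"])
def generate():
    data = request.get_json(force=True)
    prompt = data.get("prompt", "")
    # Produce a single continuation of at most 50 tokens.
    result = generator(prompt, max_length=50, num_return_sequences=1)
    return jsonify({"generated_text": result[0]["generated_text"]})

if __name__ == "__main__":
    # Cloud Run passes the port to listen on through the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))

Inside the container, an app like this would typically be served with a production server such as gunicorn rather than the Flask development server.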

The first thing we did was to set up an account with Google Cloud, in addition to downloading the Google Cloud SDK, i.e. the command-line interface (CLI). We simply followed the instructions in this tutorial to set up our account.

Why Google Cloud and not AWS or Azure?

Cloud Run allows us to deploy and expose our service without the need for any other tool. We could have deployed it on AWS, but it is a bit "complicated" to deploy Docker images in AWS Lambda functions, and in addition we would have needed API Gateway to expose the service. Deploying it on AWS Fargate is also a possibility, but in that case we would have needed a container orchestrator (ECS or EKS), and billing is per second. With AWS Lambda and Cloud Run, billing is in 100 ms blocks; AWS Fargate is typically used in cases where the compute time is longer.

In order to use Cloud Run we need to enable the API from the Google Cloud console. To do this we simply go to Cloud Run and create a service (you don't need to create the whole service; just clicking "Create Service" activates the API).

Now we can start developing our code. For this we created a template that we completed during the webinar; if you want the complete code you can download the master branch and follow the instructions to run it, but we advise you to watch the video of the webinar.
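Before building the Docker image it is worth checking that the service responds locally. As a quick sketch, assuming the /generate endpoint and port 8080 from the example above:

# test_local.py - quick sanity check against the service running on localhost.
# Assumes the /generate route and port 8080 used in the sketch above.
import requests

response = requests.post(
    "http://localhost:8080/generate",
    json={"prompt": "Once upon a time"},
)
print(response.json()["generated_text"])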

The moment of truth

Once we have the code finished and tested locally, we can push our Docker image to Google Cloud. To do this, simply run the following lines in a terminal:

docker tag text-generator-gpt2:latest eu.gcr.io/text-generator-gpt-2/text-generator-gpt2:latest

docker push eu.gcr.io/text-generator-gpt-2/text-generator-gpt2:latest

It usually takes a few minutes, but when the push is finished we can go to Cloud Run and deploy the service from the uploaded image. Remember that it is important to check "Allow unauthenticated invocations" for our service to work.

If all has gone well, the service will have been deployed correctly and you will now have the first project that you can include in your own portfolio!

IMMUNE Technology Institute, with the collaboration of Spanish Startups, has prepared a Datathon on 19 September where you can learn and attend talks very similar to this one with the best experts in Data Science.
In addition, we have organised several masterclasses and Data Science Challenges where you can put your knowledge to the test and, best of all, there will be prizes!

Sign up here!

Wait, one last thing

There's more! Due to the demand for information about our Master in Data Science, we have organised an information session on 24 September with Mónica Villas, director of the programme, to answer any questions.

IMMUNE can help you boost your career through its partner companies and contacts with recruiters and professionals in the sector. Do not hesitate to sign up if you want more information about our programmes.

Sign up here!

This article was written by Alejandro Díaz Santos (LinkedIn, GitHub).

 