Running Your Neural Network on AWS DLAMI
Why you should stop running your neural networks on your personal computer and use the computational tools available at your fingertips
Deep learning has taken the machine learning world by storm over the past decade and has rendered many prior models obsolete. From natural language processing to image classification, deep learning is widespread in its application and comes in a variety of different flavors. It's no wonder, then, that so many aspiring data scientists and machine learning enthusiasts gravitate to this part of the field.
If you've ever spun up a large enough neural network on your personal laptop, then you know just how computationally intensive training these networks can be. One of the benefits of cloud computing is that you won't hear your computer's fan whirring anymore while you train your network. Cloud computing does not use your own machine's CPU to run your models; instead it uses a slice of the computing power of whichever provider you choose. This means that rather than leaving your computer plugged in and running all night, you can simply execute the task on the cloud instance, close your laptop, and check its status in the morning.
There are a variety of cloud computing platforms to choose from. For this post I'll be going through a specific option on the largest of these providers: Amazon Web Services. To get started, you will need to set up an AWS account if you don't already have one (https://portal.aws.amazon.com/billing/signup#/start). This will ask for your credit card information, but won't charge you unless you start using higher-caliber computing resources than the ones in this blog post.
Once you are set up with an AWS account, sign in to the AWS Management Console. From the Services dropdown menu, select EC2 under the Compute category. You should see a Resources box pop up. Click Instances (running), then click the Launch Instances button.
From the AMI page you will see a host of options for your instance. This post focuses on the DLAMI, or Deep Learning AMI. This particular AMI comes preinstalled with a multitude of different environments for machine learning. Scroll down until you see the Deep Learning AMI option for Ubuntu; I'll use the Ubuntu 18.04 option. Press Select and choose the t2.micro option on the next screen. This is a very simple instance type that is free-tier eligible and will work for some very basic neural networks. If you are planning on running a larger network and need to beef up your computational power, I suggest the c5.large instance. This will, however, cost you a bit (about $0.085 per hour), so make sure you don't forget about it and leave it running for a month. Once you are done choosing your instance, click through until you get to the "Configure Security Group" tab.
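If you prefer scripting to clicking, roughly the same launch can be done with the AWS CLI. The snippet below is only a sketch: the AMI ID, key pair name, and security group ID are placeholders you would swap for your own values (the current Deep Learning AMI IDs are listed in the AMI catalog).

# Sketch only -- AMI ID, key pair name, and security group ID are placeholders
$ aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t2.micro \
    --key-name my-dlami-key \
    --security-group-ids sg-0123456789abcdef0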
The Configure Security Group tab is where you decide which inbound connections are allowed to reach your instance. For this example, leave it as it is (the default SSH rule is all we need), but if you are interested in additional connections, this is where you would add them.
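As an aside, inbound rules can also be added later from the CLI. This is again only a sketch: the security group ID is a placeholder, and opening port 22 to 0.0.0.0/0 allows SSH from anywhere, so you may prefer to restrict it to your own IP.

# Placeholder security group ID; 0.0.0.0/0 opens SSH to any address
$ aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 22 \
    --cidr 0.0.0.0/0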
You're now ready to spin up your DLAMI! Click Review and Launch. Once you've reviewed all of the settings, click the Launch button. This will bring you to the final step, which is to select a key pair. The key pair will be downloaded as a .pem file, which you will need in order to access your instance. Keep your key file somewhere safe: if you lose it, you cannot get it back and will need to spin up a new instance. If you are working with git, don't keep your key pair inside your local repo. There is nothing worse than a git pull eating your key pair. Trust me, keep your key file outside the repo; you'll thank me later. And never push your key pair to GitHub. That would let anyone access your instance, which will attract bad actors. Keep your key pair local and never upload it to the internet.
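If your project does live in a git repo, a simple extra safeguard is to tell git to ignore key files outright (run this from the repo root):

# Make sure .pem key files can never be committed by accident
$ echo "*.pem" >> .gitignore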
Once you've downloaded your key pair and launched your instance, you're good to go. Head back to the EC2 console and wait for the Instance State to say Running with the green checkmark (this may take a few minutes). Click the Instance ID hyperlink, which should bring you to a summary page for your new instance. Once you have this page up, it's time to head to the command line.
$ chmod 400 path/to/your-key.pem
$ ssh -i path/to/your-key.pem ubuntu@<Public DNS>
Open up a terminal and use the commands above to lock down the key file's permissions and SSH into your instance. Your public DNS can be found on the instance summary page. If everything works correctly, you should find yourself logged into the instance.
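If you'd rather not fish the address out of the console, the public DNS name can also be pulled with the AWS CLI (the instance ID below is a placeholder):

# Placeholder instance ID -- prints the instance's public DNS name
$ aws ec2 describe-instances \
    --instance-ids i-0123456789abcdef0 \
    --query "Reservations[0].Instances[0].PublicDnsName" \
    --output text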
You're in! You should notice from the login screen a list of the different conda environments available right off the bat. All of these come fully loaded with their frameworks and backends, which saves a ton of hassle compared with other EC2 instances and is what makes the DLAMI so appealing. Before we begin working in one of these environments, though, we should run everything in a separate tmux window.
$ sudo apt-get install -y tmux    # tmux is usually already installed on the DLAMI
$ tmux
$ source activate tensorflow2_p36
This snippet will open up a tmux window with the associated environment running in it. It's important not to enter the environment before switching to the tmux window, as this can cause errors. From here, use pip to install any additional libraries you need and bring over your model code, either with scp or by cloning it from GitHub. To detach from the tmux session and return to the base shell, press Ctrl+b, then d. I also suggest renaming your tmux session to something memorable so it's easy to find again:
$ tmux rename -t 0 deeplearning
$ tmux ls
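Later, you can reattach to the named session, and you can get your model code onto the instance either by copying it up from your local machine with scp or by cloning it from GitHub. The project directory and repo below are placeholders:

# Reattach to the renamed session (run on the instance)
$ tmux attach -t deeplearning

# Copy your project up from your LOCAL machine (placeholder paths)
$ scp -i path/to/your-key.pem -r ./my_project ubuntu@<Public DNS>:~/

# Or clone it on the instance instead (placeholder repo)
$ git clone https://github.com/<your-username>/<your-repo>.git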
You're good to go! You are now running your neural network on the DLAMI. Just make sure to terminate the instance when you are finished so you aren't billed for time you aren't using.
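Termination can also be done from the CLI if you prefer (the instance ID is again a placeholder). Note that terminating, unlike stopping, deletes the instance for good, while a stopped instance sticks around and still accrues EBS storage charges.

# Placeholder instance ID -- terminating deletes the instance entirely
$ aws ec2 terminate-instances --instance-ids i-0123456789abcdef0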