
Are communication students eligible for machine learning?

Communication students, meaning students of electrical communications and electronics engineering, are an elite stream of engineers who generally have above-average depth in mathematics, signal processing, and probability theory. In this post, we review whether, with that background, communication students are eligible for machine learning.

The good:

Communication students generally take two to three courses in engineering mathematics and one to two compulsory courses in probability and numerical methods. Combined with the usual engineering physics courses and the calculations that electronics work demands, this gives them good exposure to mathematical problem formulation, problem solving, numerical algorithms, and several other tools a machine learning practitioner needs.
What matters even more for advanced work in machine learning is an understanding of different mathematical objects, such as graphs, networks, and series, and the ability to analyze them properly. This is where a communications engineer has an edge over others. With courses in signal processing and hands hardened on countless Fourier-transform derivations, understanding the structures behind machine learning algorithms, and picking up advanced topics such as time series analysis, becomes easier by several degrees for the communication engineer.
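As a minimal sketch of that crossover (not from the original post, and assuming only NumPy is installed), the same Fourier transform a communications engineer uses to analyze a signal doubles as a feature extractor for a time series. The function name `dominant_frequencies` below is purely illustrative.

```python
import numpy as np

def dominant_frequencies(signal, sample_rate, top_k=3):
    """Return the top_k strongest frequencies (Hz) in a 1-D real signal."""
    spectrum = np.fft.rfft(signal)                       # one-sided FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    magnitudes = np.abs(spectrum)
    top = np.argsort(magnitudes)[-top_k:][::-1]          # strongest bins first
    return freqs[top]

# Example: a 5 Hz sine plus a weaker 50 Hz sine sampled at 1 kHz
fs = 1000
t = np.arange(0, 1, 1.0 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
print(dominant_frequencies(x, fs, top_k=2))              # approximately [ 5., 50.]
```

Frequencies extracted this way can be fed directly into a classifier or forecasting model, which is exactly the kind of bridge between signal processing and machine learning described above.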

The bad:

A significant part of current machine learning practice, however, is heavy use of languages such as Python, R, and Julia, along with the expertise to understand their packages and write code that interacts with these tools properly. This is where a communication engineer falls short. Working mostly with assembly, microcontroller-level code, or low-level languages like C, they are not well prepared for the fast-paced, object-oriented, statistics-focused programming that machine learning demands day to day. Engineers from other streams such as computer science do significantly better here, which is a worrying gap in today's competitive market.
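To illustrate the style of programming in question (this example is mine, not the original author's, and assumes scikit-learn is installed), everyday machine learning work leans on high-level library calls rather than hand-rolled low-level code: loading data, fitting a model, and scoring it each take a single line.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small built-in dataset of 8x8 digit images, already flattened to vectors
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit an off-the-shelf classifier and report held-out accuracy
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

Getting comfortable with this package-driven, statistical style of coding is the main adjustment a C-and-assembly background requires.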

The best:

Finally, to conclude: programming is an earned skill, and it can be picked up over time through self-study. So a communications engineering student who is willing to get into machine learning already has most of the tools the field requires. They know the math, and they can handle tough, low-level programming. All they need is some refinement and a touch of machine-learning thinking, and they are good to go.
So if you are a communications student willing to join the exciting community of machine learning enthusiasts, there has never been a better time.

