1  Fundamentals

Welcome to Intermediate Python! Each week, we cover a chapter, which consists of a lesson and an exercise. In the first week, we go over the goals of the course and review data structures and data types that you have seen before in Intro to Python. We also look at some new data structures and more properties of data structures.

In Intro to Python, you learned how to do basic data analysis such as subsetting a dataframe, looking at summary statistics, and visualizing your data. This was done in the context of a clean, Tidy dataframe. In this course, we focus on working with data “from the wild”, which comes in a messier, un-Tidy form. Let’s see what we will learn in the next 6 weeks together:

1.1 Goals of this course

  • Continue building programming fundamentals: how to use complex data structures, create custom functions, and iterate over repeated tasks.

  • Continue exploring data science fundamentals: how to clean messy data into a Tidy form for analysis, using the programming fundamentals above.

1.2 Motivation

We will be looking at a dataset from Twitter that looks like the following:

Tweet_ID  Username        Text                                      Retweets  Likes  Timestamp
1         julie81         Party least receive say or single….       2         25     2023-01-30 11:00:51
2         richardhester   Hotel still Congress may member staff….   35        29     2023-01-02 22:45:58
3         williamsjoseph  Nice be her debate industry that year….   51        25     2023-01-18 11:25:19

Suppose that we want to do some text analysis on the “Text” column: We want to assign a sentiment score (a numerical value that measures the emotional tone or attitude expressed in text) to each tweet, based on all of the words it contains. For instance, a tweet about celebrating one’s birthday will be assigned a positive sentiment score, and a tweet about getting fired from a job will be assigned a negative sentiment score.

It would be great if we had a function that takes in a String of words and outputs a sentiment score. However, that function does not exist in the built-in libraries of Python and Pandas, so we will have to write our own custom function!

When we think about writing a custom function, we usually like to sketch out an outline of what the function will do in English, and then translate it to Python code.

Given an input String of words,

  • Examine each word in the input string:

    • Associate the word with a sentiment score.

    • And keep track of this sentiment score.

  • Take the average of all the sentiment scores,

  • and return it as the output.

How do we associate a word with a sentiment score? We need another function to do that:

Given an input String with one word,

  • Load in a “lookup dictionary” that assigns words to scores.

  • Check whether the input string is in the dictionary. Some words, such as “the”, won’t have a sentiment score.

  • If so, return the score of that word.
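
Here is a rough sketch of how these two outlines might eventually translate to Python. The names score_text and score_word, the small example dictionary, and the choice to simply skip words without a score are all illustrative; we will build up each of these pieces over the coming weeks.

# A small "lookup dictionary" that assigns words to sentiment scores
sentiment_lookup = {'happy': 8, 'sad': 2, 'joy': 7.5, 'calm': 7}

def score_word(word):
    # Return the word's score if it is in the lookup dictionary, and None otherwise
    # (words such as "the" won't have a sentiment score).
    return sentiment_lookup.get(word)

def score_text(text):
    # Examine each word in the input string and keep track of its sentiment score.
    scores = []
    for word in text.split():
        word_score = score_word(word)
        if word_score is not None:
            scores.append(word_score)
    # Take the average of all the sentiment scores and return it as the output.
    if len(scores) == 0:
        return None
    return sum(scores) / len(scores)

score_text("happy joy")
7.75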

Some concepts from this outline whose technical details we will learn include: writing a custom function, iterating through a data structure, and using “lookup dictionaries”. Let’s look at the Learning Objectives of the course:

1.3 Learning Objectives

  • Understand and distinguish the use cases of data structures for storing different types of data.

  • Implement code to iterate over a collection (such as files, elements of a column, or a list of objects) to batch process each item.

  • Implement code that has a branching structure depending on conditions in the input data.

  • Create simple, modular functions that can be reused.

  • Describe the difference between copying an object vs. referencing an object and how that could affect variables in a data analysis.

1.4 Data types in Python

To get started, let’s recall the primitive data types in Python:

Data type name  Data type shorthand  Examples
Integer         int                  2, 4
Float           float                3.5, -34.1009
String          str                  "hello", "234-234-8594"
Boolean         bool                 True, False

There’s a special data type called None in Python, which is used as a placeholder. We will talk about it later in this course.

1.5 Data Structures

And recall the fundamental data structures:

  • List

  • Dataframe

  • Series

  • Dictionary

  • Tuple

We will look at our new data structure, the Dictionary, carefully today. You will learn a little bit about Tuples in your exercise.

1.6 Objects in Python

All of our Data Structures are organized under the Objects framework in Python. For each data structure type, we can examine:

  • What does it contain (in terms of data)?

  • What can it do (in terms of functions)?

And if it “makes sense” to us, then it is a well-designed Object.

Formally, an object contains the following:

  • Value that holds the essential data for the object.

  • Attributes that hold a subset of the data, or additional data, for the object.

  • Functions, called Methods, that belong to the object and take the object itself as an input.

This organizing structure on an object applies to pretty much all Python data types and data structures.

Let’s see how this applies to the Dataframe:

Suppose we have the following Dataframe:

import pandas as pd

simple_df = pd.DataFrame(data={'id': ["AAA", "BBB", "CCC", "DDD", "EEE"],
                               'case_control': ["case", "case", "control", "control", "control"],
                               'measurement1': [2.5, 3.5, 9, .1, 2.2],
                               'measurement2': [0, 0, .5, .24, .003],
                               'measurement3': [80, 2, 1, 1, 2]})
simple_df
    id case_control  measurement1  measurement2  measurement3
0  AAA         case           2.5         0.000            80
1  BBB         case           3.5         0.000             2
2  CCC      control           9.0         0.500             1
3  DDD      control           0.1         0.240             1
4  EEE      control           2.2         0.003             2

  • Value: the contents of the Dataframe, which is a tabular data format in columns and rows.

  • Attributes that allow one to access a subset of the data or additional data:

    • simple_df.id accesses the column “id”, returning a Series object.

    • simple_df.shape accesses the number of rows and columns.

    • Subsetting via the bracket .iloc[row_idx, col_idx] or .loc[row_idx, col_idx] notation.

  • Methods that can be used on the object:

    • simple_df.head() and simple_df.tail() return the first and last few rows of the Dataframe, respectively.

    • simple_df.merge(another_df) merges simple_df with another Dataframe another_df.

These are some of our favorite Dataframe attributes and methods from Intro to Python.
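
For instance, continuing with simple_df from above (the output is shown roughly as Pandas would print it):

simple_df.shape
(5, 5)

simple_df.head(2)
    id case_control  measurement1  measurement2  measurement3
0  AAA         case           2.5           0.0            80
1  BBB         case           3.5           0.0             2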

To be even more precise, even primitive data types, such as an Integer, are considered Objects in Python. We just typically access their Value, but don’t interact with their underlying Attributes and Methods.
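
For example, a Float value comes with its own Methods (a small illustration):

temperature = 98.6
temperature.is_integer()
False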

1.7 Dictionary

Today, we will introduce a new data structure, called the Dictionary. A dictionary is designed as a lookup table, organized in key-value pairs. You associate the key with a particular value, and use the key to find the value. You should consider using a dictionary when you are storing a collection of associations. Another way of saying this is that dictionaries are useful for storing and manipulating correspondence relationships.

For instance, suppose that we want to associate common English words with a sentiment value:

sentiment = {'happy': 8, 'sad': 2, 'joy': 7.5, 'embarrassed': 3.6, 'restless': 4.1, 'apathetic': 3.8, 'calm': 7}
sentiment
{'happy': 8,
 'sad': 2,
 'joy': 7.5,
 'embarrassed': 3.6,
 'restless': 4.1,
 'apathetic': 3.8,
 'calm': 7}

If we want to find the sentiment value of a word, we can look it up immediately via its key to access its value:

sentiment['joy']
7.5

However, we cannot access the nth element of a Dictionary, as we are able to do with Lists and Series:

#sentiment[0] error! Dictionaries have no positional index; 0 would be looked up as a key.

1.7.1 Basic Rules of Dictionaries

Here are some basic usage rules of Dictionaries:

  • Only one value per key. No duplicate keys allowed.

  • Keys must be an immutable type: a string, integer, float, boolean, or tuple.

  • Values can be of any type, including data structures such as lists and dictionaries.

If duplicate keys are given, then only the value from the last occurrence of the key is kept.

duplicated_keys = {'Student' : 97, 'Student': 88, 'Student' : 91}
duplicated_keys
{'Student': 91}

It is quite common to have data structures within a dictionary. Notice that when we create a Dataframe from scratch, we give it a dictionary, where the column names are keys and columns are values. A Dataframe is built on top of a dictionary with more tools!
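
As a small illustration (the variable name and values here are made up), here is a dictionary whose values are Lists, much like the dictionary we passed to pd.DataFrame() above:

measurements = {'id': ["AAA", "BBB"], 'measurement1': [2.5, 3.5]}
measurements['id']
['AAA', 'BBB']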

1.7.2 Basic Usage of Dictionaries

You can modify values of a corresponding key in a dictionary:

sentiment['joy'] = sentiment['joy'] + 1

You will get an error if you try to access a key that doesn’t exist:

#sentiment['dog']

Alternatively, if you don’t want to run the risk of getting an error, you can specify a default value using the .get() method. Here, we give a default neutral value of 5 if the key doesn’t exist.

sentiment.get("dog", 5)
5

If you don’t specify a default value and the key does not exist, you will get the special None value back.

print(sentiment.get("dog"))
None

You can add more key-value pairs via my_dict[new_key] = new_value syntax. If the key already exists, the mapping for that key will simply be updated.

sentiment['dog'] = 5

1.7.3 Dictionary vs. List

Both data structures help you organize values, and they differ in how you access those values. You access a list’s values via a numerical index, and you access a dictionary’s values via a key.

Source: https://open.oregonstate.education/computationalbiology/chapter/dictionaries/
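
As a quick side-by-side sketch (the variables colors_list and colors_dict are made up for this comparison):

colors_list = ["red", "green", "blue"]
colors_list[1]
'green'

colors_dict = {"stop": "red", "go": "green"}
colors_dict["go"]
'green'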

1.7.4 Application for Data Cleaning

Suppose that you want to do some data recoding. You want to look at the “case_control” column of simple_df and change “case” to “experiment” and “control” to “baseline”. This correspondence relationship can be stored in a dictionary. You can use the .replace() method for Series objects with a dictionary as an input argument.

simple_df.case_control.replace({"case": "experiment", "control": "baseline"})
0    experiment
1    experiment
2      baseline
3      baseline
4      baseline
Name: case_control, dtype: object
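
Note that .replace() returns a new Series rather than changing simple_df itself; if you want to keep the recoded values, one option is to assign the result back to the column:

simple_df['case_control'] = simple_df['case_control'].replace({"case": "experiment", "control": "baseline"})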

1.8 Converting between data types

Often, we need to convert between data types and data structures. When you do so, you should consider:

  1. Whether the conversion is permissible.
  2. Whether any information will be lost.

1.8.1 Data types

You can convert any number to a String.

age = 24.5
str(age)
'24.5'

Let’s try to convert a String to a Float:

age = "24.5"
float(age)
24.5

But it is not permissible to convert to an Integer, as we don’t know what to do with the decimals (we comment out code that will error, so that this page will render).

#int(age) returns an error
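
If you do want an Integer out of a String like this, one option (assuming you are fine with dropping the decimals) is to convert to a Float first, and then to an Integer:

int(float(age))
24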

And we cannot convert some Strings to any number.

car = "prius"
#float(car) returns an error

Sometimes, we need to pay attention to whether any information is lost in the conversion. Let’s convert a Float to an Integer:

temperature = 98.6
int(temperature)
98

Notice that the conversion dropped the decimal part entirely.
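
If you would rather round to the nearest whole number instead of dropping the decimals, the built-in round() function is one alternative:

round(temperature)
99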

1.9 Converting between data structures

When we look at a single column of a Dataframe, it is of the Series data structure.

simple_df['measurement1']
0    2.5
1    3.5
2    9.0
3    0.1
4    2.2
Name: measurement1, dtype: float64

Let’s convert it to a List:

simple_df['measurement1'].to_list()
[2.5, 3.5, 9.0, 0.1, 2.2]

If you look at the documentation of Series, there are a lot of other conversions you can do via the .to_*() methods, such as .to_string().
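
For instance, .to_dict() converts the Series into the Dictionary structure we learned earlier, with the row index as the keys:

simple_df['measurement1'].to_dict()
{0: 2.5, 1: 3.5, 2: 9.0, 3: 0.1, 4: 2.2}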

When making these conversions, you might ask why the column of a Dataframe is a Series rather than just a List. The answer is that a Series has attributes and methods that are more useful for data analysis than what a List offers. You can compute .mean() to get the average value of a Series or .plot() to make a simple plot, but these methods do not exist for a List. Series are also designed to compute on large datasets more efficiently than Lists. However, Lists can store elements of various data types, and can store Lists within Lists. When we make conversions, we think about which data structure is more appropriate for the task, which is a big theme of this course!
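
For instance, computing the average of the column is a single method call on the Series, whereas with a List we would have to write our own loop or reach for another function:

simple_df['measurement1'].mean()   # 3.46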

1.10 Is my variable a data type/structure?

Often, you need to check whether your variable is a specific data type or structure. From Intro to Python, you learned about the type() function, such as:

type(simple_df)
pandas.core.frame.DataFrame

This is great, but the output of type() can be rather verbose, and is usually most useful for printing and testing scenarios. For a more concise, robust way of checking, we prefer the isinstance() function:

isinstance(simple_df, pd.DataFrame)
True

This directly references the Object’s type, which is clearer.

isinstance(simple_df, list)
False
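
isinstance() can also check against several types at once by passing in a tuple of types. For example, using the age variable from earlier (which currently holds the String "24.5"):

isinstance(age, (int, float))
False

isinstance(age, str)
True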

1.11 Exercises

Exercise for week 1 can be found here.