How do I fix the TypeError: unhashable type: 'list' error?

Estimated reading time: 3 minutes

When programming in Python you will come across this error quite often; in most cases it is quite easily fixed once understood.

The problem usually arises when you put a list somewhere that requires hashable values, such as inside a set or as a dictionary key. If you are unsure what a dictionary looks like, see W3 Schools.

Let's examine the loops that don't throw the error.

Using a list produces the following output with no error:

letters = [['a'],['b'],['c'],['d'],['e'],['f']]

print(type(letters))

for i in letters:
    print(i)

<class 'list'>
['a']
['b']
['c']
['d']
['e']
['f']

If there is a need for a tuple, the following also runs with no error:

letters = (['a'],['b'],['c'],['d'],['e'],['f'])

print(type(letters))

for i in letters:
    print(i)

<class 'tuple'>
['a']
['b']
['c']
['d']
['e']
['f']

Using curly braces, which you might expect to create a dictionary, gives the error you are looking to resolve, but why?

letters = {['a'],['b'],['c'],['d'],['e'],['f']}

print(type(letters))

for i in letters:
    print(i)

letters = {['a'],['b'],['c'],['d'],['e'],['f']}
TypeError: unhashable type: 'list'

To understand the error, it is important to step back and figure out what is going on in each scenario:

(A) Looping through the list, each iteration simply looks at the values on their own, so the loop completes with no problem.

(B) Unlike lists, tuples are immutable (they cannot be modified); more importantly here, they can be looped through with no error.

(C) In the third example the curly braces do not create a dictionary at all: without key: value pairs, {...} creates a set. A set requires its elements to be hashable, and lists are mutable and therefore unhashable, hence the error.
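A short sketch makes the difference visible (the variable names here are mine):

```python
# Without key: value pairs, {...} is a set literal. Sets need hashable
# elements; lists are mutable and unhashable, so this raises TypeError.
try:
    bad = {['a'], ['b']}
except TypeError as exc:
    message = str(exc)

print(message)  # unhashable type: 'list'

# Tuples are immutable and hashable, so a set of tuples works fine.
good = {('a',), ('b',)}
for item in sorted(good):
    print(item)
```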

How do we fix this error going forward?

The simplest way is to flatten the list of single-item lists with the code below:

import itertools

fixlist = [['a'],['b'],['c'],['d'],['e'],['f'],['f'],['c']]

# Flatten the list of lists, then remove duplicates with set
fixlist = list(set(itertools.chain.from_iterable(fixlist)))

print(fixlist)
Result : ['d', 'f', 'c', 'b', 'a', 'e'] (note: set order may vary between runs)

Now your code is only looping through single hashable values within a list, instead of trying to place unhashable lists inside curly braces.

Approaching the problem line by line helped to pinpoint where the error was thrown.

Consequently, the steps I went through to fix the problem involved:

(A) print(type(variable)) – use this on the data being passed in to see what the data types are; it clarifies whether this is the problem.

(B) Once the line of code throwing the error was found, removing the curly-brace structure fixed the problem.

Or

If a dictionary does need to be looped through, it needs proper key: value pairs set up.
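A minimal dictionary loop with proper key: value pairs looks like this (the keys and letters are just illustrative):

```python
# A dictionary maps keys to values; items() yields both in the loop.
letters = {1: 'a', 2: 'b', 3: 'c'}

print(type(letters))  # <class 'dict'>
for key, value in letters.items():
    print(key, value)
```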

Conclusion

In conclusion, in order to remove this error it is important to identify the line or lines that place lists inside curly braces and convert them to a plain list

or

if a dictionary is needed, ensure that the lists are converted to a dictionary with key: value pairs.

If you would like to see a very good video explanation of this error, head over to Brandon Jacobson's YouTube channel, and make sure to subscribe.

His explanation is below:

What is data analytics?

Estimated reading time: 4 minutes

In recent years, data analytics and the importance of data analysis have grown significantly.

As a result of this, the quantity of data now being processed has increased.

Here at Data Analytics Ireland we will look to explain the concept; please share or link to this article so others can see its contents.

To begin with, the reason for this is that the digital economy has taken off, in particular:

(A) Automation of tasks has become easier.

(B) Less use of paper, becoming a greener economy.

(C) Technology improvements have meant that storage and big data processing make the process of delivering services easier.

(D) Career opportunities for professionals with good skills have increased.

(E) A wide range of open source and paid tools are now easily available that help to process and report on the data.

(F) Entry requirements are easy, and this coupled with an ability to quickly acquire knowledge and skills helps entry whether you want to be full time or part-time.

(G) Knowledge and skills have improved as access to online learning has improved significantly.

As a result of all this:

(A) Large data volumes need to be analysed.

(B) Consumers' habits, such as how they use a service or the information they look for, now have a digital footprint.

(C) Once a consumer has used a service (whether they purchased or not), the ability to understand their habits can be used to deliver the following:

  1. The services they want.
  2. The products they want.
  3. Quicker turnaround time.

How can this help with all the data that is captured and stored?

So far in this article, we have outlined the background to how the industry has evolved to where data analytics is now.

As outlined, traditionally much of this information would not have been stored in a format that was easily accessible.

Step 1 – Data Capture

To understand what you want to analyse, and help draw conclusions accurately, a data analyst will work with their technical colleagues to ensure that the correct data is captured.

Data capture of raw data can happen in a number of ways:

  • User input.
  • Interaction with a website or application.
  • Consuming a service.
  • Requesting a service be completed.
  • Social media interaction.
  • In a lot of cases, this is now happening in real time.

Once completeness and accuracy are ensured at the point of capture, data quality will become less of an issue.

Step 2 – Analysing

There are a number of ways to analyse the data once it is captured:

(A) Create visual charts of the data; this allows the viewer to get an initial view of the information without looking at the underlying data. Sometimes this will show patterns or clusters in the types of data you capture.

(B) Use data science statistics to see if they can explain the data. This could show information such as how the data is correlated or otherwise. Probabilities could also be calculated to show what outcomes might happen in the future.

(C) Data analysts might also need to build a machine learning model that uses more complex algorithms to explain the data better; sometimes patterns that are not immediately obvious can be unearthed and investigated further.
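As a small illustration of point (B), a correlation between two columns can be computed with the standard library alone. The spend and sales figures below are made up purely for the example:

```python
import statistics

# Hypothetical figures: does advertising spend move in line with sales?
spend = [10, 20, 30, 40, 50]
sales = [12, 24, 33, 41, 52]

# Pearson correlation computed by hand from means and deviations
mean_x, mean_y = statistics.mean(spend), statistics.mean(sales)
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, sales))
var_x = sum((x - mean_x) ** 2 for x in spend)
var_y = sum((y - mean_y) ** 2 for y in sales)
r = cov / (var_x * var_y) ** 0.5

print(round(r, 3))  # close to 1: the two series are strongly correlated
```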

Step 3 – Presenting

Can a viewer come to a decision from raw data alone? An emphatic no is the answer, and this is where some of the visualisation tools come in!

Visually presenting data points allows a viewer of the information to come to a decision quickly, and the tools outlined below will help with that process.

Below are a handful of tools that allow the data to be sliced and diced; there are many more out there, but they all allow data to be drilled into to reach a real understanding of what is going on.

Some of the tools include Tableau, Power BI, and Python (it has libraries that do a nice job).

Step 4 – Decision Making

So after all this analysis, decisions need to be made:

(A) Is the data in order and in the correct format?

(B) Is it in a place where it can be accessed and reviewed?

(C) Is it relevant to when the decision needs to be made?

From the outset, as part of the work of performing the data analytics, an assessment needs to be made as to how often a decision will need to be made, with what data, and when.

At this point, the decision-makers should have a set of data ready for them to look over and reliably make a decision.

If they can't make a decision based on the data returned, then steps 1-3 should be reviewed and revisited.

Often what happens is that the information required to make a change needs to be updated or improved upon.

It is the job of the teams that manage the data sets to source that data, or change how they present it, to reflect the decision that needs to be made.

Data cleansing in a business environment

I was on LinkedIn recently and, looking at my profile, saw a post I had made about six years ago on data cleansing. One thing that struck me was that the topics I brought up then are as relevant now as they were then, and with Big Data now mainstream, many companies are wondering how to manage all this data in an ever-changing landscape. So I thought I would share it again.

The Business case

(A) Test data meets industry requirements.

In some industries, it is a legal requirement to have all your data displaying the correct format and description. Any pieces of information not included should be removed based on business rules. Today companies operate across multiple platforms (electronic, print, video, etc.), so a process needs to be in place to make sure the data is in sync!

(B) Check for unwanted words appearing.

Branding and reputation are critical, and businesses large and small need a mechanism to understand what has been written online in connection with their profile. Data cleansing can be the first port of call for finding unwanted words that would damage the brand.

The Technical case

(A) Remove unwanted characters such as !"£$%^&*@';:#~?>< etc.

When presented with a set of data from another source, it may be in a raw format, and if you are looking to group the words or numbers in your dataset, stray characters can lead to incorrect grouping.
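A minimal sketch of that cleanup, assuming a simple alphanumeric whitelist is acceptable for the data in question (the sample values are made up):

```python
import re

# Raw values polluted with stray symbols; keep only letters, digits and spaces.
raw = ["Facebook!", "Face*book", "£Facebook", "Facebook"]
cleaned = [re.sub(r"[^A-Za-z0-9 ]", "", value) for value in raw]

print(cleaned)  # ['Facebook', 'Facebook', 'Facebook', 'Facebook']
```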

(B) Group data.

Sometimes you will need to group specific names or numbers to see how often they appear. This can become problematic if an initial review of the data was not carried out, as with point A above. Say you want to check for Facebook in your dataset: if there are six occurrences of it, 4 x "Facebook!" and 2 x "Facebook", your grouping will be incorrect, giving you the wrong analysis.
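The Facebook example above can be sketched with collections.Counter, before and after stripping the stray character:

```python
from collections import Counter

# Six occurrences that should all count as one group: 4 x "Facebook!"
# and 2 x "Facebook" split into two groups until the "!" is removed.
records = ["Facebook!"] * 4 + ["Facebook"] * 2

before = Counter(records)
after = Counter(r.rstrip("!") for r in records)

print(before)  # two separate groups: 4 and 2
print(after)   # one group of 6
```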

(C) Prepare data.

The main reason for cleansing data is to have it ready to be processed. More often than not, cleansing is an initial step before further processing starts. The main processing will have built-in controls to make sure the data is in the correct format; if it is not, it will fail. This step is crucial in an automated process.
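A minimal sketch of such a pre-load control, assuming a date field that must be in YYYY-MM-DD format (the field names and rows are illustrative):

```python
from datetime import datetime

# Rows arriving from an upstream source; one date is in the wrong format.
rows = [{"id": 1, "date": "2023-01-15"},
        {"id": 2, "date": "15/01/2023"}]

def valid_date(value):
    """Return True only if value parses as YYYY-MM-DD."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

good = [r for r in rows if valid_date(r["date"])]
bad = [r for r in rows if not valid_date(r["date"])]

print(len(good), len(bad))  # 1 1 - the badly formatted row is held back
```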

(D) Check for null values.

Sometimes a system is set up to process data or receive data from a third-party vendor. It might be imperative that specific fields are not empty, or are empty, depending on the business need. An initial analysis to identify those values through the data cleansing process should help to mitigate problems before the data gets loaded into systems that have strict controls on them.
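A short sketch of that initial null check, with made-up records and mandatory fields:

```python
# Flag records whose mandatory fields are empty or missing before they
# reach a downstream system with strict controls.
records = [
    {"name": "Acme", "email": "info@acme.com"},
    {"name": "", "email": "sales@example.com"},   # empty name
    {"name": "Widgets", "email": None},           # null email
]
mandatory = ("name", "email")

problems = [r for r in records
            if any(not r.get(field) for field in mandatory)]

print(len(problems))  # 2 records fail the null check
```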