How to Get Zillow Property Details with Python [2023]

In this video, I will show you how to get all property details such as beds, baths, lot size, Zestimate, and more, all using Python for free. This will help you retrieve data at scale for fast and accurate analysis.

Ariel Herrera 0:00

Are you looking to retrieve property details data for a market? Tired of copying information manually to evaluate a single deal? Well, you're not alone. When I started real estate investing in 2019, I was deep into spreadsheets, but realized it was slow, cumbersome, and error prone to analyze properties. This is where I discovered APIs to automatically retrieve property data. In this video, I will show you how to get all property details such as beds, baths, lot size, Zestimate, and more, all using Python for free. This will help you retrieve data at scale for fast and accurate analysis. My name is Ariel Herrera, your fellow data scientist with the Analytics Ariel channel, where we bridge the gap between real estate and technology. I love bringing easy-to-use, data-driven solutions that you can apply to your business. Please like this video so that we can reach a wider audience, and subscribe to this channel so you don't miss out on the latest content. All right, let's get started.

So if you're coming from the last video, we went over how to use the property details and listings tool. In this example, we came in as investors wanting to evaluate a particular deal. This deal is 3611 Potter Street in Tampa, Florida. Tampa is my market, and this property is located near a nightlife and daytime area called Ybor City that has restaurants and lots of fun activities. The way we were able to evaluate the deal is we signed up for the Zillow Scraper API. Once we did, we went over to the Streamlit app that I created. Within the Streamlit app we entered our email address, so mine is Ariel Herrera at coffeeclosers.com. Then I selected property detail and go, and this retrieved my latest searches. If I select the first one, you can now see the same exact property. We have information like price per square foot, also calculated. We can go all the way down to the bottom and look at taxes paid, price history, as well as comparables, and we can see where these comparables are, too. And if we want to download the entire dataset of comps, or of property details, to have values like Zestimate, rent estimate, lot size, how many searches and how many saves the property has had, and more, we can do that as well.

But what if you want to get this data for a multitude of properties, or if you want to integrate it as an API, maybe into your Podio system? A useful language to do this is Python. So I'm going to walk you through, step by step, how to retrieve this data programmatically. And again, if you're new to programming, feel free to check out my free series of introduction to Python on my channel.

So now, going straight into the API: this is Zillow Scraper by Scrapeak. I'm an affiliate of Scrapeak but not the creator, therefore, if you have any questions particular to the dataset, please ask them directly. Zillow Scraper offers US and Canada real estate data; you can search millions of for-sale and rental listings. In this case, we'll be able to get data both for on- and off-market properties, so those that are for sale and those that are not for sale. Now, you can come down below to look at the API documentation. But first, make sure you sign up with the link below; you will be able to use 100 free searches, or 1,000 credits, on a monthly basis. Once you sign up, you'll be able to come to your account. In your account, you will go to plans and select the starter plan, and you can go up in your plan as you'd like. Now, for account settings, this is where you're going to find your API key; your API key is towards the bottom. This is like your secret key or password that will allow you to query the data. Copy the key.

Our next step is to start working with the API, so we're going to go into Google Colab. Google Colab is a free notebook environment where you can code programmatically without having Python installed on your machine. So right now I'm on Google Drive; you can create a free Google account. Then if I right-click, I can see I can start up Google Docs, Sheets, Slides, Forms, and there's a More option. Now, I already have Google Colaboratory installed. If you do not, you can select connect more apps; it takes a second to download Google Colab for free. Once you can see it on your right-click, click it and you will create a new notebook. You can title your notebook anything that you'd like; I'm going to put demo for property details. If you're new to Google Colab, again, I suggest watching the series of introduction to Python.
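The very first cell of that notebook only needs to hold the key you just copied, along these lines (the value below is a placeholder, not a real key):

```python
# First Colab cell: the API key copied from the Scrapeak account settings page.
# Placeholder value -- paste your own key here and keep it private.
api_key = "YOUR_SCRAPEAK_API_KEY"
```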
But as a quick tutorial: you can share your file with others, you can leave comments, so it's collaboration friendly. And you can also download the file to your Drive, or even save a copy to GitHub.

So now we have our API key. Let's set our API key to a variable. I'm going to go back to my account, copy it, and paste it within a string. Once you have your API key copied, you can press the play button. This will now save your API key as a variable, and we can reference it later.

If we go to the Zillow Scraper API documentation, we can see there are different endpoints. Think of endpoints almost as a folder structure, where we want to retrieve data between the different files. In particular, we want to get data on the property itself, so property details. It is required that we have an API key and the zpid. The zpid is the unique identifier that Zillow gives a property. So let's copy this over into our notebook, put it here with Ctrl+V, and we can see we need to input our API key and our zpid. We can set this to a URL variable and put it into a string.

To get our zpid, which will be our second variable, we can go back to the property and copy the zpid from up top in the URL; it's going to be towards the end, with the underscore zpid. We can add that in as a variable, so I'm going to put zpid equals and put it into a string. Now we want to enter these two variables into our URL, so we can use dot format. Here, for the first element, zero, I'm going to put in the API key variable. Then for the zpid, one, I'm going to enter zpid.

Next, we want to make a request to get the data. So we're going to add a line to import requests; this will be the library that we use to get our data. So requests dot request, and let's say GET, since it's going to be a GET request; as we look at the documentation, it says GET. Then we're going to input our URL and run this. But before we run it, we want to set it to a variable, so let's put this as response equals. That's going to run and retrieve the data for that property, for that zpid, using our API key.

If we now look at response, we can see 200; that means it's a success. But if we want to get more detail, we can transform this into a JSON object. As a JSON object, we get all the information back. It's kind of overwhelming, so let's just look at the keys to see what data we can extract. We see that we have is_success, data, and message. If we just want to look at whether this query was successful or not, we can index it with is_success, and we'll see the answer is true. If we want to index by message, we can see scraped successfully. And lastly, if we want to look at the data, we can view here all of our data, which has the information for the property.

Personally, I like to think of data as rows and columns, so let's now transform this into a table. We're going to use pandas, which is another library within Python. Run up top import pandas as pd, then we can add a new line, or we can just work with the one that we currently have, since this is pretty long. Let's go back to the top, and here we're going to wrap this in pd dot json underscore normalize, because we're normalizing our JSON, and let's call this df_prop, for dataframe property. Let's run it, and it runs successfully. Let's view the contents, df_prop, and we can see we have one single row with 602 columns; that is a super wide dataset.
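Put together, the cells described above might look roughly like the sketch below. The endpoint URL is an illustrative stand-in for the property-details URL shown in the Scrapeak documentation (copy the exact URL from the docs), and the key and zpid are placeholder values:

```python
import requests
import pandas as pd

# API key from the Scrapeak account settings page (placeholder, not a real key).
api_key = "YOUR_SCRAPEAK_API_KEY"

# zpid copied from the end of the Zillow listing URL (example value).
zpid = "12345678"

# Property-details endpoint -- illustrative; paste the exact URL from the API docs.
url = "https://app.scrapeak.com/v1/scrapers/zillow/property?api_key={0}&zpid={1}"
url = url.format(api_key, zpid)

# GET request, as the documentation specifies.
response = requests.request("GET", url)
print(response)            # <Response [200]> on success

data = response.json()
print(data.keys())         # dict_keys(['is_success', 'data', 'message'])
print(data["is_success"])  # True
print(data["message"])     # e.g. scraped successfully

# Flatten the nested JSON under 'data' into a single-row, very wide DataFrame.
df_prop = pd.json_normalize(data["data"])
df_prop                    # one row, ~600 columns in the video's example
```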
We can't see all of the columns, though, because they're hidden behind this dot, dot, dot. So what we can do is add, towards the top, an option that will allow us to view all columns. Here I'm going to put pd dot set option, display max columns. This is not something I've memorized; I've just gone into Google and searched how to view all columns of a pandas data frame, and then I'm able to copy the code over and paste it. A lot of programming is searching and copying other code, and then bringing it all together, so there's not a lot of memorization to it. Now, if we run this exact data frame, we're going to see, instead of the dot, dot, dot, all of the columns. We have information on property details like bedrooms, bathrooms, price, and so much more. If you can think of it and you've seen it on the actual listing site, then you can probably find it in one of these columns. But in particular, there are just certain columns that I am curious about initially. So what I can do is say print street address, locate the street address column in this data frame, and grab just the first element in that first row; there's only one row, so that's pretty simple. And we can see here the street address, 3611 Potter Street.
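Those display and lookup steps, plus the extra context fields printed in the next step, might look roughly like this. The column names are assumptions based on Zillow's usual field names; run print(df_prop.columns) to see the exact names in your response:

```python
import pandas as pd

# Show every column instead of collapsing the middle ones into "..."
pd.set_option("display.max_columns", None)

# df_prop is the one-row DataFrame built with pd.json_normalize above.
# Pull individual fields out of it with .loc; the column names are assumptions.
print("street address:", df_prop.loc[0, "streetAddress"])

# A few more context columns, similar to the prints shown next in the video.
for col in [
    "city",
    "state",
    "homeStatus",
    "bedrooms",
    "bathrooms",
    "yearBuilt",
    "zestimate",
    "rentZestimate",
    "description",
]:
    if col in df_prop.columns:
        print(f"{col}: {df_prop.loc[0, col]}")
```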
Now, I've already gone through looking at these columns, so I'm just going to copy over some other print statements and run them. Here, we can see that we have multiple columns that help us get more context on the property itself, like city, state, home status, which for this one is for sale, bedrooms, bathrooms, year built, Zestimate, rent estimate, and description. Awesome.

So what you've been able to do in this tutorial is sign up for an API key, read the documentation to understand how to make a GET request, open up a Google Colab notebook, and start to write code to retrieve data. Now, if you have a multitude of zpids, or properties that you need to get data for, then you can use my property details bulk upload tool. It's currently in version one and may have version two already released by the time this video comes out. You can upload up to 1,000 rows of property data and receive details within a short span of time. I'd love to hear your use cases for property data. Are you building a tool that you'd like to share on the channel? Or perhaps you're doing research, and maybe you're a data scientist as well? I'd love to hear your use cases below. And if you haven't already, please like and subscribe. Thanks!

Transcribed by https://otter.ai
