About this Tutorial:
What's inside?
In this notebook, the basics of data intake are introduced.
- Data will be downloaded using Colab's terminal commands and then loaded into Python's pandas.
- Geospatial data will be imported from Esri and then loaded into geopandas.
- A variety of data formats will be covered.
Objectives
By the end of this tutorial, users should have an understanding of:
- Importing data with pandas and geopandas
- Querying data from Esri
- Retrieving data programmatically
- Loading data in a variety of formats

Note: this module assumes the data needs no handling prior to intake.
Background
For this next example to work, we will need to import hypothetical CSV files.

Try it! Go ahead and run the cell below in Colab.
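Since the files here are hypothetical, a minimal sketch of the intake step is shown below: it writes a tiny stand-in CSV to disk and reads it back with pandas. The filename and column names are made up for illustration; in practice you would point `pd.read_csv` at a real path or URL.

```python
import pandas as pd

# Create a tiny, hypothetical CSV file (a stand-in for downloaded data).
with open("example_intake.csv", "w") as f:
    f.write("CSA2010,hhchpov18\nAllendale/Irving Park,35.27\nCanton,4.61\n")

# Load it with pandas, exactly as this tutorial does for real files.
df = pd.read_csv("example_intake.csv")
print(df.shape)  # (2, 2)
```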
Advanced
The Function
```python
import pandas as pd
import geopandas as gpd


class Intake:

    # 1. Recursively calls getData until something valid is given.
    # Returns a df or False. Calls readInGeometryData for geospatial
    # formats, or pulls a CSV directly.
    def getData(url, interactive=False):
        escapeQuestionFlags = ["no", '', 'none']
        if Intake.isPandas(url):
            return url
        if str(url).lower() in escapeQuestionFlags:
            return False
        if interactive:
            print('Getting Data From: ', url)
        try:
            if [ele for ele in ['pgeojson', 'shp', 'geojson'] if ele in url]:
                from dataplay import geoms
                df = geoms.readInGeometryData(url=url, porg=False, geom='geometry',
                                              lat=False, lng=False, revgeocode=False,
                                              save=False, in_crs=2248, out_crs=False)
            elif 'csv' in url:
                df = pd.read_csv(url)
            return df
        except Exception:
            if interactive:
                return Intake.getData(input("Error: Try Again? ( URL/ PATH or 'NO' ) "), interactive)
            return False

    # 1ai. A misnomer. Returns Bool.
    def isPandas(df):
        return isinstance(df, (pd.DataFrame, gpd.GeoDataFrame, tuple))

    # a1. Used by the Merge Lib. Returns a valid (df, column),
    # (df, False), or (False, False).
    def getAndCheck(url, col='geometry', interactive=False):
        df = Intake.getData(url, interactive)  # Returns False or a df
        if not Intake.isPandas(df):
            if interactive:
                print('No data was retrieved.', df)
            return False, False
        if isinstance(col, list):
            for colm in col:
                if not Intake.getAndCheckColumn(df, colm, interactive):
                    if interactive:
                        print('Exiting. Error on the column: ', colm)
                    return df, False
            return df, col
        newcol = Intake.getAndCheckColumn(df, col, interactive)  # Returns False or the column
        if not newcol:
            if interactive:
                print('Exiting. Error on the column: ', col)
            return df, False
        return df, newcol

    # a2. Returns Bool.
    def checkColumn(dataset, column):
        return {column}.issubset(dataset.columns)

    # b1. Used by the Merge Lib. Returns both datasets and the coerce status.
    def coerce(ds1, ds2, col1, col2, interactive):
        ds1, ldt, lIsNum = Intake.getdTypeAndFillNum(ds1, col1, interactive)
        ds2, rdt, rIsNum = Intake.getdTypeAndFillNum(ds2, col2, interactive)
        ds2 = Intake.coerceDtypes(lIsNum, rdt, ds2, col2, interactive)
        ds1 = Intake.coerceDtypes(rIsNum, ldt, ds1, col1, interactive)
        # Return the data and the coerce status
        return ds1, ds2, (ds1[col1].dtype == ds2[col2].dtype)

    # b2. Used by the Merge Lib. Fills NA values with a sentinel number.
    def getdTypeAndFillNum(ds, col, interactive):
        dt = ds[col].dtype
        isNum = dt == 'float64' or dt == 'int64'
        if isNum:
            ds[col] = ds[col].fillna(-1321321321321325)
        return ds, dt, isNum

    # b3. Used by the Merge Lib.
    def coerceDtypes(isNum, dt, ds, col, interactive):
        if isNum and dt == 'object':
            if interactive:
                print('Converting Key from Object to Int')
            ds[col] = pd.to_numeric(ds[col], errors='coerce')
            if interactive:
                print('Converting Key from Int to Float')
            ds[col] = ds[col].astype(float)
        return ds

    # a3. Returns False or the column. Interactive mode calls itself recursively.
    def getAndCheckColumn(df, col, interactive=False):
        if Intake.checkColumn(df, col):
            return col
        if not interactive:
            return False
        print("Invalid column given: ", col)
        print(df.columns)
        print("Please enter a new column from the list above.")
        col = input("Column Name: ")
        return Intake.getAndCheckColumn(df, col, interactive)
```
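The dtype-coercion helpers in the class above boil down to `pd.to_numeric` followed by a cast to `float` so that both join keys end up with the same dtype. Here is a standalone sketch of that idea; the two DataFrames are made up for illustration.

```python
import pandas as pd

# Two hypothetical tables whose join keys disagree in dtype:
# one stores the key as strings ('object'), the other as integers.
left = pd.DataFrame({"key": ["1", "2", "3"], "a": [10, 20, 30]})
right = pd.DataFrame({"key": [1, 2, 3], "b": [100, 200, 300]})

# Mirror coerceDtypes: parse the object column to numbers, then
# cast both sides to float so the dtypes match exactly.
left["key"] = pd.to_numeric(left["key"], errors="coerce").astype(float)
right["key"] = right["key"].astype(float)

assert left["key"].dtype == right["key"].dtype
merged = pd.merge(left, right, on="key")
print(len(merged))  # 3
```

Without this coercion, merging an object key against an integer key matches nothing, which is why the Merge Lib calls `coerce` first.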
```python
u = Intake
rdf = Intake.getData(url)
```
Here we can save the data so that it may be used in later tutorials.
| | OBJECTID | CSA2010 | hhchpov14 | hhchpov15 | hhchpov16 | hhchpov17 | hhchpov18 | hhchpov19 | CSA2020 | hhchpov20 | hhchpov21 | Shape__Area | Shape__Length | geometry |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | Allendale/Irving... | 41.55 | 38.93 | 34.73 | 32.77 | 35.27 | 32.60 | Allendale/Irving... | 21.42 | 21.42 | 6.38e+07 | 38770.17 | POLYGON ((-76.65... |
1 | 2 | Beechfield/Ten H... | 22.31 | 19.42 | 21.22 | 23.92 | 21.90 | 15.38 | Beechfield/Ten H... | 14.77 | 14.77 | 4.79e+07 | 37524.95 | POLYGON ((-76.69... |
2 | 3 | Belair-Edison | 36.93 | 36.88 | 36.13 | 34.56 | 39.74 | 41.04 | Belair-Edison | 31.76 | 31.76 | 4.50e+07 | 31307.31 | POLYGON ((-76.56... |
```python
# .to_csv(string+'.csv', encoding="utf-8", index=False, quoting=csv.QUOTE_ALL)
```
Download the data by:
- Clicking the 'Files' tab in the left-hand menu of this screen, locating your file in the file explorer that appears under the 'Files' tab, right-clicking the file, and selecting 'Download' from the dropdown.
You can bring this data into the next tutorial in one of two ways:
- Upload the saved file to Google Drive and connect the notebook to your Drive path, OR
- Download the dataset as directed above, navigate to the next tutorial, and upload the file using the 'Upload' button found in the 'Files' tab in the left-hand menu of that screen. The next tutorial will teach you how to load this data so that it may be mapped.
Here are some examples:
Using Esri and the Geoms handler directly:
```python
from dataplay import geoms

geoloom_gdf_url = "https://services1.arcgis.com/mVFRs7NF4iFitgbY/ArcGIS/rest/services/Geoloom_Crowd/FeatureServer/0/query?where=1%3D1&outFields=*&returnGeometry=true&f=pgeojson"
geoloom_gdf = geoms.readInGeometryData(url=geoloom_gdf_url, porg=False, geom='geometry',
                                       lat=False, lng=False, revgeocode=False,
                                       save=False, in_crs=4326, out_crs=False)
geoloom_gdf = geoloom_gdf.dropna(subset=['geometry'])
geoloom_gdf.head(1)
```
| | OBJECTID | Data_type | Attach | ProjNm | Descript | Location | URL | Name | PhEmail | Comments | POINT_X | POINT_Y | GlobalID | geometry |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | Artists & Resources | NaN | Joe | Test | 123 Market Pl, B... | | | | | -8.53e+06 | 4.76e+06 | e59b4931-e0c8-4d... | POINT (-76.60661... |
Again but with the Intake class:
```python
Geoloom_Crowd, rcol = u.getAndCheck('https://services1.arcgis.com/mVFRs7NF4iFitgbY/ArcGIS/rest/services/Geoloom_Crowd/FeatureServer/0/query?where=1%3D1&outFields=*&returnGeometry=true&f=pgeojson')
Geoloom_Crowd.head(1)
```
The getAndCheck function is useful for confirming that a required field is present in the retrieved data.
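Under the hood, the column check is just a set-subset test against `df.columns` (the `checkColumn` helper). A small sketch with a made-up frame:

```python
import pandas as pd

# Hypothetical frame with the two columns we care about.
df = pd.DataFrame({"CSA2010": ["Canton"], "geometry": [None]})

# checkColumn's test: is the required column among df's columns?
print({"geometry"}.issubset(df.columns))    # True  -> getAndCheck returns the column
print({"hhchpov18"}.issubset(df.columns))   # False -> getAndCheck returns (df, False)
```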
```python
Hhpov = Hhpov[['CSA2010', 'hhpov15', 'hhpov16', 'hhpov17', 'hhpov18', 'hhpov19']]
# Hhpov.to_csv('Hhpov.csv')
```
We could also retrieve from a file.
```python
# rdf = u.getData('Hhpov.csv')
rdf.head()
```
| | OBJECTID | CSA2010 | hhchpov14 | hhchpov15 | hhchpov16 | hhchpov17 | hhchpov18 | hhchpov19 | CSA2020 | hhchpov20 | hhchpov21 | Shape__Area | Shape__Length | geometry |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | Allendale/Irving... | 41.55 | 38.93 | 34.73 | 32.77 | 35.27 | 32.60 | Allendale/Irving... | 21.42 | 21.42 | 6.38e+07 | 38770.17 | POLYGON ((-76.65... |
1 | 2 | Beechfield/Ten H... | 22.31 | 19.42 | 21.22 | 23.92 | 21.90 | 15.38 | Beechfield/Ten H... | 14.77 | 14.77 | 4.79e+07 | 37524.95 | POLYGON ((-76.69... |
2 | 3 | Belair-Edison | 36.93 | 36.88 | 36.13 | 34.56 | 39.74 | 41.04 | Belair-Edison | 31.76 | 31.76 | 4.50e+07 | 31307.31 | POLYGON ((-76.56... |
3 | 4 | Brooklyn/Curtis ... | 46.94 | 45.01 | 46.45 | 46.41 | 39.89 | 41.39 | Brooklyn/Curtis ... | 51.32 | 51.32 | 1.76e+08 | 150987.70 | MULTIPOLYGON (((... |
4 | 5 | Canton | 6.52 | 5.49 | 2.99 | 4.02 | 4.61 | 4.83 | Canton | 4.13 | 4.13 | 1.54e+07 | 23338.61 | POLYGON ((-76.57... |