Hi! We are BNIA-JFI.
This package was made to help with data handling.
Included:
- Functions built and used by BNIA for day to day tasks.
- Made to be shared via IPYNB/ Google Colab notebooks with in-built examples using 100% publicly accessible data & links.
- Online documentation and PyPI libraries created from the notebooks.
Dataplay uses functions found in our VitalSigns module, and vice versa.
Create maps, network graphs, and GIFs!
You can use these docs to learn from, or as documentation when using the attached library.
TIPS
Content covered in previous tutorials will be used in later tutorials.
New code and/or information should have explanations and/or descriptions attached.
Concepts or code covered in previous tutorials will be used without being explained in their entirety.
If content cannot be found in the current tutorial and is not covered in previous tutorials, please let me know.
This notebook has been optimized for Google Colab run in a Chrome browser.
Statements found on the index page regarding views expressed, responsibility, errors and omissions, use at risk, and licensing extend throughout the tutorial.
The code is on PyPI, so you can install the scripts as a Python library using the command:
!pip install dataplay geopandas
Their modules will be retrieved from the VitalSigns GDrive repo they have mounted into their Colab environment.
Then...
1) Import the installed module into your code:
from VitalSigns.acsDownload import retrieve_acs_data
2) Use it:
retrieve_acs_data(state, county, tract, tableId, year, saveAcs)
Now you could do something like merge it with another dataset!
from dataplay.merge import mergeDatasets
mergeDatasets(left_ds=False, right_ds=False, crosswalk_ds=False, use_crosswalk = True, left_col=False, right_col=False, crosswalk_left_col = False, crosswalk_right_col = False, merge_how=False, interactive=True)
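For intuition, a crosswalk merge is just two successive joins: the left dataset joins the crosswalk, and the result joins the right dataset on the shared crosswalk column. Here is a minimal pandas sketch of that idea; the column names and values are invented for illustration, and mergeDatasets handles this (and more, interactively) for you:

```python
import pandas as pd

# Hypothetical example data; column names and values are invented.
left = pd.DataFrame({'tract': ['T1', 'T2'], 'population': [100, 200]})
crosswalk = pd.DataFrame({'tract': ['T1', 'T2'], 'csa': ['CSA-A', 'CSA-A']})
right = pd.DataFrame({'csa': ['CSA-A'], 'median_income': [45000]})

# Step 1: attach the crosswalk column to the left dataset.
merged = left.merge(crosswalk, on='tract', how='left')
# Step 2: join the right dataset on the shared crosswalk column.
merged = merged.merge(right, on='csa', how='left')
print(merged)
```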
You can get information on the package, modules, and methods by using the help command.
Here we look at the package's modules:
import dataplay
help(dataplay)
Let's take a look at what functions the geoms module provides:
import dataplay.geoms
help(dataplay.geoms)
And here we can look at an individual function and what it expects:
import VitalSigns.acsDownload
help(VitalSigns.acsDownload.retrieve_acs_data)
So here's an example:
Import your modules
%%capture
import pandas as pd
from VitalSigns.acsDownload import retrieve_acs_data
from dataplay.geoms import workWithGeometryData
from dataplay.geoms import map_points
from dataplay.intaker import Intake
Read in some data
Define our download parameters.
More information on these parameters can be found in the tutorials!
tract = '*'
county = '510'
state = '24'
tableId = 'B19001'
year = '17'
saveAcs = False
And download the Baltimore City ACS data using the imported VitalSigns library.
df = retrieve_acs_data(state, county, tract, tableId, year, saveAcs)
Here we can import and display a geospatial dataset with special intake requirements.
Here we pull Baltimore City's CSA (Community Statistical Area) boundaries into a GeoDataFrame, csa_gdf (the loading code is not shown here).
help(csa_gdf.plot)
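Since the loading step for csa_gdf is not shown, here is a minimal stand-in sketch that builds a tiny GeoDataFrame with the same shape the examples below rely on (a CSA2010 label column, an hhchpov18 indicator column, and polygon geometry). All names and values here are invented for illustration; the real csa_gdf comes from BNIA's published CSA boundaries:

```python
import geopandas as gpd
from shapely.geometry import Polygon

# Toy stand-in for the real CSA boundaries; values are invented.
csa_gdf = gpd.GeoDataFrame(
    {
        'CSA2010': ['Example CSA A', 'Example CSA B'],
        'hhchpov18': [12.5, 30.1],  # stand-in indicator column
    },
    geometry=[
        Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
        Polygon([(1, 0), (2, 0), (2, 1), (1, 1)]),
    ],
    crs='EPSG:4326',
)
csa_gdf.head()
```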
Now, in this example, we will load in a set of coordinates.
geoloom_gdf_url = "https://services1.arcgis.com/mVFRs7NF4iFitgbY/ArcGIS/rest/services/Geoloom_Crowd/FeatureServer/0/query?where=1%3D1&outFields=*&returnGeometry=true&f=pgeojson"
geoloom_gdf = dataplay.geoms.readInGeometryData(url=geoloom_gdf_url, porg=False, geom='geometry', lat=False, lng=False, revgeocode=False, save=False, in_crs=4326, out_crs=False)
geoloom_gdf = geoloom_gdf.dropna(subset=['geometry'])
# geoloom_gdf = geoloom_gdf.drop(columns=['POINT_X','POINT_Y'])
geoloom_gdf.head(1)
And here we get the number of points in each of our corresponding CSAs (polygons)
geoloom_w_csas = dataplay.geoms.workWithGeometryData(method='pinp', df=geoloom_gdf, polys=csa_gdf, ptsCoordCol='geometry', polygonsCoordCol='geometry', polyColorCol='hhchpov18', polygonsLabel='CSA2010', pntsClr='red', polysClr='white')
And we plot it with a legend
geoloom_w_csas.plot( column='pointsinpolygon', legend=True)
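For intuition, the 'pinp' count reduces to a point-in-polygon test applied to every point. Here is a pure-Python ray-casting sketch of that test, independent of dataplay; the polygon and points are invented for illustration:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count crossings of a ray from (x, y) toward +x."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line at height y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
points = [(1, 1), (3, 3), (0.5, 1.5)]
count = sum(point_in_polygon(px, py, square) for px, py in points)
print(count)  # 2 of the 3 points fall inside the square
```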
What if we wanted to create an interactive click map with the label of each CSA (polygon) attached to each point?
Well we just run the reverse operation!
geoloom_w_csas = workWithGeometryData(method='ponp', df=geoloom_gdf, polys=csa_gdf, ptsCoordCol='geometry', polygonsCoordCol='geometry', polyColorCol='hhchpov18', polygonsLabel='CSA2010', pntsClr='red', polysClr='white')
And then we can visualize it like so:
outp = map_points(geoloom_w_csas, lat_col='POINT_Y', lon_col='POINT_X', zoom_start=12, plot_points=True, cluster_points=False,
pt_radius=1, draw_heatmap=True, heat_map_weights_col=None, heat_map_weights_normalize=True,
heat_map_radius=15, popup='CSA2010')
These interactive visualizations can be exported to html using a tool found later in this document.
It's how I made this page!
If you like what you see, there is more in the package; you will just have to explore.
Have Fun!
Disclaimer
Views Expressed: All views expressed in this tutorial are the author's own and do not represent the opinions of any entity whatsoever with which they have been, are now, or will be affiliated.
Responsibility, Errors and Omissions: The author makes no assurance about the reliability of the information. The author takes no responsibility for updating the tutorial nor maintaining its performant status. Under no circumstances shall the author or their affiliates be liable for any indirect, incidental, consequential, or special and/or exemplary damages arising out of or in connection with this tutorial. Information is provided 'as is' with distinct plausibility of errors and omissions. Information found within the contents is attached with an MIT license. Please refer to the License for more information.
Use at Risk: Any action you take upon the information in this tutorial is strictly at your own risk, and the author will not be liable for any losses and damages in connection with the use of this tutorial and subsequent products.
Fair Use: This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. While no intention is made to unlawfully use copyrighted work, circumstances may arise in which such material is made available in an effort to advance scientific literacy. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in Section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 108, the material in this tutorial is distributed without profit to those who have expressed a prior interest in receiving the included information for research and education purposes.
For more information, go to: http://www.law.cornell.edu/uscode/17/107.shtml. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.
License
Copyright © 2019 BNIA-JFI
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
From a local copy of the git repo:
- Clone the repo locally onto GDrive
- Via Direct-DL&Drive-Upload or Colab/Terminal/Git
git clone https://github.com/BNIA/dataplay.git
- Update the IPYNB
- From the GDrive dataplay folder via Colabs
- Build the new libraries from these NBs
- Using this index.ipynb
- Mount the Colab environment (and navigate to) the GDrive dataplay folder
- run
!nbdev_build_lib
to build the .py modules.
- Test the Library/ Modules
- Using the same runtime as step 2's index.ipynb.
- Do not install the module from PyPI (if published) and then...
- Import your module (from your dataplay/dataplay)
- If everything runs properly, go to step 5.
- Edit modules directly
- Within the same runtime as step 2/3's index.ipynb...
- Locate the dataplay/dataplay using the colab file nav
- double-click the .py modules in the file nav to open them in an in-browser editor
- Make changes and return to step 3 with the following caveat:
- Use hot-module reloading to ensure updates are auto-re-imported:
%load_ext autoreload
%autoreload 2
- Then when finished, persist the changes from the .py modules back to the .ipynb docs
- via
!nbdev_update_lib
and !relimport2name
- Create Docs, Push to Github, and Publish to PyPI
# settings.ini > requirements = fastcore>=1.0.5 torchvision<0.7
# https://nbdev.fast.ai/tutorial.html#View-docs-locally
# console_scripts = nbdev_build_lib=nbdev.cli:nbdev_build_lib
# https://nbdev.fast.ai/search
Not shown.
!nbdev_build_lib
!pip install geopandas
!pip install VitalSigns
!pip install marko
!nbdev_build_docs --force_all True --mk_readme True
!nbdev_nb2md 'notebooks/index.ipynb' > README.md