Tag: digital

  • Thrift Label

    Thrift Label is a search engine aggregator for second-hand e-commerce listings.

    Building on the idea of Futureshop, Thrift Label is a search aggregator that combines many different marketplaces, but includes only the second-hand listings.

    By combining the search results from many big and small e-commerce sites into a single search experience, Thrift Label makes it easier to find great second-hand deals on exactly what you are looking for.

    Mega-marketplaces such as eBay and Amazon ensure that even long-tail searches will often return relevant results.
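
    Under the hood, listings are collected with Scrapy. Here is a minimal sketch of what one marketplace spider can look like; the site, selectors, and field names are all hypothetical:

        import scrapy

        class SecondHandSpider(scrapy.Spider):
            # Hypothetical marketplace; in practice each supported site
            # gets its own spider with its own selectors.
            name = "example_marketplace"
            start_urls = ["https://example-marketplace.com/second-hand"]

            def parse(self, response):
                for card in response.css("div.listing"):
                    yield {
                        "title": card.css("h2::text").get(),
                        "price": card.css(".price::text").get(),
                        "url": response.urljoin(card.css("a::attr(href)").get()),
                    }
                # Follow pagination until the listings run out.
                next_page = response.css("a.next::attr(href)").get()
                if next_page:
                    yield response.follow(next_page, callback=self.parse)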

    I expect lots more to come for Thrift Label!

    Tools:

    • Django
    • MongoDB Atlas Search
    • Neon Postgres
    • Digital Ocean App Platform
    • Bootstrap
    • Scrapy

    link: https://thriftlabel.com

  • NYC Open Data Explorer

    A data explorer and visualizer built off the NYC Open Data portal.

    NYC Open Data is an expansive portal of over 3000 datasets, files, maps, and more. I built a portal to quickly search, sort, and skim the datasets, and open them up for basic exploration.

    This project is a little bit unique in that my goal was not to dive deeply into a specific dataset; rather, it was to cleanly and intelligently display any of the datasets.

    To start, I had to scrape the metadata from all 3000+ datasets. After first using Playwright to render the JavaScript, I found a method that didn’t require JavaScript at all, instead parsing the page text with regular expressions.
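
    As a rough sketch of that no-JavaScript approach (the URL and the embedded-metadata regex are illustrative assumptions, not the portal’s actual markup):

        import re
        import requests

        def fetch_dataset_title(dataset_url: str):
            # The metadata is already in the initial HTML response, so a
            # plain GET is enough; no headless browser required.
            html = requests.get(dataset_url, timeout=30).text
            # Hypothetical pattern: pull the dataset name out of a JSON
            # blob embedded in the page source.
            match = re.search(r'"name"\s*:\s*"([^"]+)"', html)
            return match.group(1) if match else None

        # Placeholder URL; the real crawl iterated over every dataset page.
        title = fetch_dataset_title("https://data.cityofnewyork.us/d/some-dataset-id")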

    I chose Streamlit as the visualization package, largely because it looked cool and feature-rich.

    For the viewer page, I wanted the page to detect which columns held latitude and longitude, since the datasets do not have consistent column names. With those columns detected, I could map any dataset that contained a lat/long pair.
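
    A minimal sketch of that detection logic, assuming the dataset has already been loaded into a pandas DataFrame; the candidate column names are illustrative guesses, not the app’s full list:

        import pandas as pd
        import streamlit as st

        # Candidate coordinate column names (an illustrative subset).
        LAT_NAMES = {"latitude", "lat", "point_y"}
        LON_NAMES = {"longitude", "lon", "lng", "point_x"}

        def find_coord_columns(df: pd.DataFrame):
            lat = next((c for c in df.columns if c.strip().lower() in LAT_NAMES), None)
            lon = next((c for c in df.columns if c.strip().lower() in LON_NAMES), None)
            return lat, lon

        def maybe_map(df: pd.DataFrame) -> None:
            # Render a map only if the frame appears to contain coordinates.
            lat_col, lon_col = find_coord_columns(df)
            if lat_col and lon_col:
                # Coerce to numeric, drop unmappable rows, and rename to
                # the column names st.map() accepts directly.
                coords = df[[lat_col, lon_col]].apply(pd.to_numeric, errors="coerce").dropna()
                coords.columns = ["lat", "lon"]
                st.map(coords)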

    There are many features that I’d still like to add, especially around the handling of other types of datasets, such as “map”, which is sometimes GeoJSON and sometimes CSV, and thus a bit more complex to process.

    Project requirements:

    • Scrape metadata from all datasets
    • View, display, and sort metadata
    • Viewer to load and display dataset data
    • Automatic recognition of lat/long columns, conversion to numeric formats, and display on a map

    Tools:

    • Scrapy
    • Streamlit
    • Python (pandas, requests)

    link: https://nycopendataexplorer.streamlit.app/

  • Futureshop

    Futureshop is a search marketplace for clothing made more sustainably, with a lower environmental footprint.

    I’ve always been a very “conscious consumer”, but I’m also acutely aware of the extra effort that takes. I don’t mind the effort, of course, but I know lots of people who would love to shop more “sustainably” if it were even close to as convenient as conventional shopping.

    The thesis behind Futureshop is that if customers are given a broad search experience of pre-qualified products, they will use it.

    Initial prototyping was built on the Bubble no-code system, before switching to Django for extensibility, control, and scale.

    The Django front-end accesses a MongoDB database using Atlas Search. The database is populated by a custom-built Scrapy scraping platform.
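
    The search side looks roughly like this (the index name, database, collection, and field paths are assumptions for illustration, not the production schema):

        from pymongo import MongoClient

        client = MongoClient("mongodb+srv://...")     # placeholder Atlas connection string
        products = client["futureshop"]["products"]   # hypothetical db/collection names

        def search_products(query: str, limit: int = 20):
            # Atlas Search runs as the first stage of an aggregation pipeline.
            pipeline = [
                {"$search": {
                    "index": "default",
                    "text": {"query": query, "path": ["title", "description"]},
                }},
                {"$limit": limit},
            ]
            return list(products.aggregate(pipeline))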

    Tools:

    • Django
    • Bootstrap
    • MongoDB
    • Scrapy
    • Digital Ocean
    • PlanetScale MySQL

    link: https://futureshop.co

  • Integrated Data System with Dashboard

    Full-stack data processing system, from intake forms, to data-cleaning, to reporting and dashboards.

    I was tasked with designing and implementing a full-scope data system for an inspection program with multiple constituencies.

    The solution had many constraints, including the requirement to work seamlessly with a loosely affiliated network of consultants and members.

    The data-intake form was built on top of Excel, as it was a tool everyone was familiar with. Extensive controls and guidelines were built into the Excel form. A data-ingestion pipeline was then built in Python to clean and review the data and load it into a SQL database.
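
    A simplified sketch of one ingestion pass (the sheet name, column names, and connection string are placeholders; the production rules were far more extensive):

        import pandas as pd
        from sqlalchemy import create_engine

        # Read the intake form; the sheet/header layout here is a placeholder.
        raw = pd.read_excel("intake_form.xlsx", sheet_name="Inspections", header=2)

        # Normalize headers and drop fully empty rows.
        raw.columns = [str(c).strip().lower().replace(" ", "_") for c in raw.columns]
        raw = raw.dropna(how="all")

        # Coerce key fields; rows that fail go to a manual-review pile.
        raw["inspection_date"] = pd.to_datetime(raw["inspection_date"], errors="coerce")
        needs_review = raw[raw["inspection_date"].isna()]
        clean = raw.drop(needs_review.index)

        # Load the clean rows into the SQL database.
        engine = create_engine("mysql+pymysql://user:pass@host/inspections")  # placeholder DSN
        clean.to_sql("inspections", engine, if_exists="append", index=False)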

    From there, numerous Python scripts were developed to manage the intake forms, report on the data, and manage the program.

    In parallel, another SQL database was developed to host an inspection management and scheduling platform, from which a front-end portal app was built using the app builder DronaHQ.

    Requirements:

    • data intake
    • data quality assurance
    • data warehousing
    • reporting
    • live data dashboards

    Tools:

    • Python (pandas, openpyxl, sqlalchemy)
    • MySQL
    • BOX.com
    • Excel
    • DronaHQ

  • Inspection Project Management System

    A regional company involved in inspections and contracting was buckling under the pressure of a 100% increase in project volume within 12 months.

    Until then, the tried-and-true program management methods of legal pads, filing cabinets, and paper charts had been working just fine. But with the increased volume, pressure and anxiety jumped.

    Working with the whole team, I led the development of a project management system based on Salesforce. During discovery and design, we investigated custom-programmed solutions, but an app built on Salesforce was projected to be less expensive to build and quicker to iterate on.

    I knew the new system would be a dramatic process change, so the project included an extensive training and onboarding phase to bring every team member into the system and train them successfully on the new methods.