search_fundamentals_course

Public repository for the Search Fundamentals course taught by Daniel Tunkelang and Grant Ingersoll. Available at https://corise.com/course/search-fundamentals?utm_source=daniel

Primary language: Python. License: Apache-2.0.

Welcome to Search Fundamentals

Search Fundamentals is a two-week class taught by Grant Ingersoll and Daniel Tunkelang. It focuses on helping students quickly get up to speed on search by teaching the basics: indexing, querying, faceting/aggregations, spell checking, typeahead autocomplete, and highlighting.

The class is a hands-on, project-driven course where students work with real data and the OpenSearch/Elasticsearch ecosystem.

Class code layout (e.g. where the projects are)

For our class, we have two weekly projects. Each project is a standalone Python Flask application that interacts with an OpenSearch server (and perhaps other services).

You will find these two projects in the directories below, organized in the following way:

  • Week 1:
    • week1 -- The unfinished template for the week's project, annotated with instructions.
  • Week 2:
    • week2 -- The unfinished template for the week's project, annotated with instructions.

Our instructor-annotated results for each project will be provided during the class.
Please note that these represent our way of doing the assignment and may differ from your results, as there is often more than one way of doing things in search.

You will also find several supporting directories and files for Docker and Gitpod.

You can also use the included Makefile to interact with the project, including running pyenv, Flask, and re-indexing the data. Run all make commands at the root of the repo (i.e. /workspace/search_fundamentals_course) so that make can find the Makefile.

Prerequisites

  1. For this class, you will need a Kaggle account and a Kaggle API token.
  2. No prior search knowledge is required, but you should be able to code in Python or Java (all examples are in Python).
  3. You will need a Gitpod account.

Working in Gitpod (Officially Supported)

  1. Fork this repository.

  2. Launch a new Gitpod workspace based on this repository. This will automatically start OpenSearch and OpenSearch Dashboards.

     Note: it can take a few minutes for OpenSearch and the Dashboards to launch.
  3. You should now have a running OpenSearch instance (port 9200) and a running OpenSearch Dashboards instance (port 5601).

  4. Log in to the dashboards at https://5601-<$GITPOD_URL>/ with the default username admin and password admin. This should pop up automatically as a new tab, unless you have blocked popups. Note that in the real world you would change your password; since these ports are inaccessible unless you are logged in to Gitpod, the default is OK here.

     $GITPOD_URL is a placeholder for your ephemeral Gitpod host name, e.g. silver-grasshopper-8czadqyn.ws-us25.gitpod.io     
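Beyond the dashboards, you can sanity-check the cluster from Python. The snippet below is only a sketch: it assumes the default admin/admin credentials and the self-signed certificate the dev setup ships with, and the actual HTTP call is left as a comment so nothing fires without a running cluster.

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    """Build an HTTP Basic Auth header for OpenSearch's default security plugin."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def health_url(host: str = "localhost", port: int = 9200) -> str:
    """Cluster health endpoint for a local OpenSearch instance."""
    return f"https://{host}:{port}/_cluster/health"

# With a running cluster and the `requests` package installed, you could do:
#   requests.get(health_url(), headers=basic_auth_header("admin", "admin"),
#                verify=False)  # verify=False: the dev setup uses a self-signed cert
print(health_url())
```

A "green" or "yellow" status in the health response means the cluster is ready for the class exercises.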
    

Downloading the Best Buy Dataset

  1. Run the Kaggle API token install script and follow the instructions:
./install-kaggle-token.sh
  2. Accept all of the Kaggle competition rules, then run the download data script (or make download):
./download-data.sh
  3. Verify your data is in the right location:
ls /workspace/datasets
  4. You should see: popular_skus.py product_data test.csv train.csv
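Once the files are in place, you can peek at a CSV with plain Python before indexing anything. The column names in the sample below are purely illustrative, not the real train.csv header; check the actual file yourself.

```python
import csv
import io

def peek_rows(csv_text: str, n: int = 2) -> list:
    """Parse CSV text and return up to the first n rows as dicts keyed by header."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for _, row in zip(range(n), reader)]

# Hypothetical sample in the rough shape of a query log; not the real header.
sample = "query,sku\nipad,1111111\nlaptop,2222222\n"
print(peek_rows(sample))
```

Swapping the inline sample for open("/workspace/datasets/train.csv").read() would show you the real columns.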

Setting up the Indexes

Run the indexing script to create and populate the product and query indexes:

./index-data.sh
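Under the hood, bulk indexing in OpenSearch takes a newline-delimited JSON body: an action line, then the document, for each record. A rough sketch of building such a payload (the index name and fields here are placeholders, not the script's actual code):

```python
import json

def to_bulk_payload(index: str, docs: list) -> str:
    """Serialize documents into the NDJSON body expected by the _bulk API."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))  # action line
        lines.append(json.dumps(doc))                           # document line
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

payload = to_bulk_payload("bbuy_products", [{"name": "iPad", "sku": "123"}])
print(payload)
```

This string would be POSTed to /_bulk with Content-Type: application/x-ndjson.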

🐞 Debugging: Testing the datasets that were downloaded from Kaggle

Running GET /bbuy_products/_count returned the following; a count of 0 means no products were indexed:

{
  "count": 0,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  }
}

Running GET /bbuy_queries/_count returned:

{
  "count": 5595807,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  }
}

To do a reset, I deleted the indexes:

DELETE /bbuy_products
DELETE /bbuy_queries
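The same count and delete calls can be issued from Python instead of the Dashboards dev console. A sketch assuming the default local endpoint and admin/admin credentials; the actual HTTP calls are left as comments so the snippet runs without a cluster:

```python
def count_url(index: str, host: str = "localhost", port: int = 9200) -> str:
    """Endpoint returning the document count for an index (GET /<index>/_count)."""
    return f"https://{host}:{port}/{index}/_count"

def delete_url(index: str, host: str = "localhost", port: int = 9200) -> str:
    """Endpoint for deleting an index (DELETE /<index>)."""
    return f"https://{host}:{port}/{index}"

# With `requests` and a running cluster (self-signed dev cert, hence verify=False):
#   requests.get(count_url("bbuy_products"), auth=("admin", "admin"), verify=False)
#   requests.delete(delete_url("bbuy_products"), auth=("admin", "admin"), verify=False)
print(count_url("bbuy_products"))
print(delete_url("bbuy_queries"))
```

After deleting both indexes, re-run ./index-data.sh to rebuild them.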

Exploring the OpenSearch Sample Dashboards and Data

  1. Log in to OpenSearch and point your browser at https://5601-<$GITPOD_URL>/app/opensearch_dashboards_overview#/
  2. Click the "Add sample data" link.
  3. Click the "Add data" link for any of the three sample data sets listed. In the class, we chose the "Sample flight data", but any of the three is fine for exploration.

Running the Weekly Project

At the command line, do the following steps to run the example.

  1. Activate your Python virtual environment. We use pyenv and pyenv-virtualenv, but you can use whatever you are most comfortable with.
     A. pyenv activate search_fundamentals -- activate that virtualenv.
  2. Run Flask: run make week1 or make week2, depending on which week you need.
     A. Open the Flask app at https://3000-<$GITPOD_URL>/ (or whatever port you choose).
  3. Or run ipython.
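Whether in Flask or ipython, most of the weekly work revolves around building OpenSearch query DSL bodies. As a minimal illustration (the `name` field is hypothetical; use whatever fields the product index actually has), a basic match query looks like:

```python
def match_query(field: str, text: str, size: int = 10) -> dict:
    """Build a simple OpenSearch match query body in the query DSL."""
    return {
        "size": size,                # how many hits to return
        "query": {
            "match": {
                field: text          # analyzed full-text match on one field
            }
        }
    }

body = match_query("name", "ipad")
# This dict would be sent as JSON to POST /bbuy_products/_search
print(body)
```

From here the weekly projects layer on aggregations, filters, and highlighting.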

Working locally (Not supported, but may work for you. YMMV)

To run locally, you will need a few things:

  1. Pyenv and Pyenv-Virtualenv with Python 3.9.7 installed
  2. Docker
  3. A Git client

Note: these have only been tested on a Mac running macOS 12.2.1. YMMV. Much of what you will need to do will be similar to what's in .gitpod.Dockerfile.

  1. pyenv install 3.9.7
  2. pip install all of the libraries you see in .gitpod.Dockerfile.
  3. Set up your weekly Python environments per "Running the Weekly Project" above.
  4. Run OpenSearch:
     A. cd docker
     B. docker-compose up
  5. Note: most of the scripts and projects assume the data is in /workspace/datasets, but they have overrides to specify your own directories. You will need to download the data and plan accordingly.
  6. Do your work per the weekly project.