Step: Download Docker Desktop
Step: Install nginx with "pull" and "run"
Key problem: where is nginx serving? "80/tcp"
Step: Forward host machine requests to the container: "0.0.0.0:80->80/tcp"
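The steps above can be sketched as follows (the container name `web` is illustrative):

```shell
# pull the official nginx image, then run it detached,
# forwarding host port 80 to the container's port 80
docker pull nginx
docker run -d --name web -p 80:80 nginx

# the PORTS column now shows "0.0.0.0:80->80/tcp"
docker ps
```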
How to invert docker ps output
How
Step: Create a new bash script (".docker_ps_invert.sh")
Step: Make it executable
Why?
Alternative: You can choose specific columns to show (like below)… But what if you want to see every column? That's where the script above shines.
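The inverting idea can be sketched like this (the helper name is made up; `docker ps --format "{{json .}}"` emits one JSON object per container, and this prints every column as its own "Header: value" row):

```shell
# print each container's columns vertically instead of as a wide table
invert_ps() {
  python3 -c '
import json, sys
for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    for key, value in json.loads(line).items():
        print(f"{key}: {value}")
    print()
'
}

# usage: docker ps --format "{{json .}}" | invert_ps
```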
How to format, filter docker output
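A quick sketch of the built-in flags (the column and filter choices here are just examples):

```shell
# choose specific columns with --format
docker ps --format 'table {{.Names}}\t{{.Image}}\t{{.Ports}}'

# narrow the list with --filter
docker ps --filter 'status=running' --filter 'name=web'
```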
install nginx on mac with homebrew
Q: How do I install nginx?
Q: Where is nginx configured?
Q: Where is nginx hosted on homebrew?
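A sketch of the homebrew answers (the paths are the usual homebrew defaults; verify on your machine with `nginx -t`):

```shell
brew install nginx

# config lives under the homebrew prefix:
#   Apple silicon: /opt/homebrew/etc/nginx/nginx.conf
#   Intel:         /usr/local/etc/nginx/nginx.conf
nginx -t            # prints the config path it loaded

# homebrew's nginx listens on port 8080 by default (no sudo needed)
brew services start nginx
```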
Chunking text
These are basic chunking utilities for quickly breaking large text blocks into smaller chunks, starting with character-based, then word-based, then sentence-based chunking.
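A minimal sketch of the three utilities (the function names are illustrative, and the sentence splitter is a naive regex, not a full tokenizer):

```python
import re

def chunk_by_chars(text: str, size: int) -> list[str]:
    """Split text into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def chunk_by_words(text: str, size: int) -> list[str]:
    """Split text into chunks of `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def chunk_by_sentences(text: str, size: int) -> list[str]:
    """Split text into chunks of `size` sentences (naive split on . ! ?)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [" ".join(sentences[i:i + size]) for i in range(0, len(sentences), size)]
```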
My VSCode Vim settings
User Settings > Key Bindings
Step: Find "Preferences: Open User Settings (JSON)"
Explanation: the "code" command. Do this in the command palette by typing "shell command".
Key Repeat on Mac
Different installations use different commands (github)
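For the Mac key-repeat fix, one common command is the per-app `defaults` write below (the bundle id here is for the standard VS Code build; Insiders or VSCodium builds use different ids, which is why different installations use different commands):

```shell
# macOS only: disable the press-and-hold accent popup for VS Code
# so holding j/k repeats the key in Vim mode
defaults write com.microsoft.VSCode ApplePressAndHoldEnabled -bool false
```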
async openai calls with python
In the previous post we explored asyncio (link); here we build on that to create async openai calls. This uses the standard python "asyncio" library and the function asyncio.gather to concurrently call openai 3 times (you can add as many calls as you like).
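The gather pattern can be sketched with stand-in calls (the `fake_openai_call` name and body are placeholders; with the `openai` package you would instead await `client.chat.completions.create(...)` on an `AsyncOpenAI` client):

```python
import asyncio

async def fake_openai_call(prompt: str) -> str:
    # placeholder for the real API call; the sleep simulates network latency
    await asyncio.sleep(0.1)
    return f"response to: {prompt}"

async def main() -> list[str]:
    # gather runs the three calls concurrently, not one after another
    return await asyncio.gather(
        fake_openai_call("first"),
        fake_openai_call("second"),
        fake_openai_call("third"),
    )

results = asyncio.run(main())
```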
Async python with asyncio
You have two options for concurrent python. Traditionally, we used threads for concurrency, but the release of the asyncio library (version 3.4) and of async/await syntax (version 3.5) added support for native coroutines, familiar to users of the many languages that support them. (history of asyncio) asyncio (docs) The…
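A minimal runnable sketch of the coroutine syntax: three 0.1-second waits overlap, so total wall time stays near 0.1s instead of 0.3s.

```python
import asyncio
import time

async def work(n: int) -> int:
    await asyncio.sleep(0.1)  # yields control to the event loop while "waiting"
    return n * 2

async def main() -> list[int]:
    # the three sleeps run concurrently on one thread
    return await asyncio.gather(work(1), work(2), work(3))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
```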
local inference with llama-cpp-python
Step: Install the package
Step: Download a model
"TheBloke" on huggingface (link) has a ton of models in "GGUF" format (a format introduced by llama.cpp). Click on any of these quantized (reduced-precision) models and find the "download" link. Put them in your project, somewhere like a "models" directory.
Step: Create your python (.py) or…
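The setup steps can be sketched as follows (the model filename is a placeholder; pick any GGUF file from huggingface):

```shell
# Step: install the package
pip install llama-cpp-python

# Step: put a downloaded GGUF model somewhere like ./models
mkdir -p models
# e.g. models/<some-quantized-model>.gguf, then in python:
#   from llama_cpp import Llama
#   llm = Llama(model_path="models/<some-quantized-model>.gguf")
```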
rsync with python without extra files
Your goal is to send your python project to your server from your local machine. Your python project includes a ton of files you wouldn't want to send to a server, so a simple rsync is not enough.
Option 1: rsync with a bunch of exclude flags
Option 2: Shell function
Option 3: Shell script
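Option 2 might look like this (the `pysync` name and the exclude list are illustrative):

```shell
# rsync a python project while skipping local-only files
pysync() {
  rsync -av \
    --exclude '.git/' \
    --exclude '.venv/' \
    --exclude '__pycache__/' \
    --exclude '*.pyc' \
    "$1" "$2"
}

# usage: pysync ./myproject user@server:/srv/myproject
```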