# hwsite

A word learning web application built with Rust, Axum, and PostgreSQL. Features dictionary lookups, word lists, search history, and OpenAI GPT integration for definitions.

## Prerequisites

- Rust (stable)
- PostgreSQL

## Setup

### 1. Create the database

```
createdb hwsite
```

### 2. Configure environment variables

Copy the example file and edit it:

```
cp .env.example .env
```

Required variables:

| Variable | Description | Default |
|---|---|---|
| `DATABASE_URL` | PostgreSQL connection string | (required) |
| `SESSION_SECRET` | Secret key for session signing (use a random 64-char string) | `default-secret-change-me-in-production-please` |
| `OPENAI_API_KEY` | OpenAI API key for GPT word definitions | — |
| `MEDIA_ROOT` | Path to media file directory | `./media` |
| `HOST` | Bind address | `0.0.0.0` |
| `PORT` | Listen port | `3000` |
| `RUST_LOG` | Log level filter | `hwsite=info,tower_http=info` |

Example `.env`:

```
DATABASE_URL=postgres://user:password@localhost/hwsite
SESSION_SECRET=change-me-to-a-random-64-char-string
OPENAI_API_KEY=sk-xxx
MEDIA_ROOT=/path/to/media
```

### 3. Build and run

```
cargo run
```

Database migrations run automatically on startup. The app will be available at http://localhost:3000.
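
In outline, the startup sequence is "connect a pool, then run the embedded migrator". A minimal sketch assuming SQLx's `migrate!` macro (the pool size and error handling here are illustrative, not copied from `src/main.rs`):

```rust
use sqlx::postgres::PgPoolOptions;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to the database named in DATABASE_URL.
    let pool = PgPoolOptions::new()
        .max_connections(5)
        .connect(&std::env::var("DATABASE_URL")?)
        .await?;

    // Embed ./migrations at compile time and apply any pending ones.
    sqlx::migrate!("./migrations").run(&pool).await?;

    // ... build the Axum router and serve here ...
    Ok(())
}
```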

## Project structure

```
src/
├── main.rs               # Entry point, router setup
├── config.rs             # Environment/config loading
├── error.rs              # Error types
├── auth/
│   ├── middleware.rs     # AuthUser / OptionalUser extractors
│   └── session.rs        # Password hashing, flash messages
├── db/
│   ├── mod.rs            # Connection pool setup
│   ├── users.rs          # User queries
│   ├── words.rs          # Word queries
│   ├── word_lists.rs     # Word list queries
│   ├── history.rs        # History queries
│   └── session_store.rs  # PostgreSQL session backend
├── handlers/             # Route handlers
└── services/
    ├── gpt.rs            # OpenAI GPT integration
    ├── dictcn.rs         # dict.cn scraper
    ├── huffpost.rs       # HuffPost article fetching
    └── pron.rs           # Pronunciation utilities
migrations/               # SQL migrations (auto-applied)
templates/                # Tera HTML templates
static/                   # CSS and JS assets
```
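
The `AuthUser` / `OptionalUser` extractors noted above follow the usual Axum pattern of reading the current user out of the session inside `FromRequestParts`. Purely as a hypothetical sketch (assuming axum 0.8, the tower-sessions crate suggested by the `tower_sessions` table below, and a `user_id` session key; the real code in `src/auth/middleware.rs` may differ):

```rust
use axum::{
    extract::FromRequestParts,
    http::{request::Parts, StatusCode},
};
use tower_sessions::Session;

// Hypothetical extractor; the field and the session key name are assumptions.
pub struct AuthUser {
    pub id: i64,
}

impl<S> FromRequestParts<S> for AuthUser
where
    S: Send + Sync,
{
    type Rejection = StatusCode;

    async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
        // Reuse the Session extractor provided by tower-sessions.
        let session = Session::from_request_parts(parts, state)
            .await
            .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;

        // Treat a stored "user_id" as the logged-in marker.
        match session.get::<i64>("user_id").await {
            Ok(Some(id)) => Ok(AuthUser { id }),
            _ => Err(StatusCode::UNAUTHORIZED),
        }
    }
}
```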

## Database

The app uses SQLx with automatic migrations. Two migrations are included:

- `initial` — `auth_user`, `accounts_usertoken`, and `tower_sessions` tables
- `words` — `words_word`, `words_wordsapi`, `words_wordct`, `words_example`, `words_wordlist`, `words_wordlistitem`, `words_history`, `words_article` tables

The password format is compatible with Django's `pbkdf2_sha256` scheme, so existing Django user records can be used directly.
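
Concretely, Django stores such hashes as `pbkdf2_sha256$<iterations>$<salt>$<base64 digest>`. The snippet below only illustrates how that format can be verified in Rust with the `pbkdf2`, `sha2`, and `base64` crates; it is not the code in `src/auth/session.rs`, which may use different crates.

```rust
use base64::Engine;
use pbkdf2::pbkdf2_hmac;
use sha2::Sha256;

/// Verify a password against a Django-style "pbkdf2_sha256$iters$salt$hash" string.
fn verify_django_password(password: &str, encoded: &str) -> bool {
    let parts: Vec<&str> = encoded.split('$').collect();
    if parts.len() != 4 || parts[0] != "pbkdf2_sha256" {
        return false;
    }
    let iterations: u32 = match parts[1].parse() {
        Ok(n) => n,
        Err(_) => return false,
    };
    let expected = match base64::engine::general_purpose::STANDARD.decode(parts[3]) {
        Ok(bytes) => bytes,
        Err(_) => return false,
    };

    // Derive a key of the same length and compare (a real implementation
    // should use a constant-time comparison).
    let mut derived = vec![0u8; expected.len()];
    pbkdf2_hmac::<Sha256>(password.as_bytes(), parts[2].as_bytes(), iterations, &mut derived);
    derived == expected
}
```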

## Routes

### Authentication

- `GET/POST /accounts/login/` — Login
- `GET /accounts/logout/` — Logout

### Words

- `GET /w/` — Home page
- `GET /w/x/` — Dictionary lookup
- `GET /w/word/{text}/` — Word details
- `POST /w/lookup/` — AJAX word search (JSON)
- `GET /w/wordsapi/{word}/` — WordsAPI definitions (JSON)
- `POST /w/counter/` — Word counter tool

### Word lists

- `GET/POST /w/create-word-list/` — Create word list
- `GET/POST /w/edit-word-list/{id}/` — Edit word list
- `GET /w/word-list/{id}/` — Run/practice word list
- `GET /w/quick-list/{id}/` — Quick view
- `GET /w/demo/` — Demo list

### History & misc

- `GET /w/history/` — Search history
- `POST /w/history/delete/` — Delete history items
- `GET /w/news/` — Latest news
- `GET /w/api/huffpost/` — HuffPost API
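
For orientation, registering a handful of these routes in Axum looks roughly like the following. The handler bodies are placeholders and the `{text}` capture syntax assumes axum 0.8; the real wiring lives in `src/main.rs` and `src/handlers/`.

```rust
use axum::{
    extract::Path,
    routing::{get, post},
    Router,
};

// Placeholder handlers; the real ones render Tera templates or return JSON.
async fn home() -> &'static str {
    "home page"
}

async fn word_detail(Path(text): Path<String>) -> String {
    format!("details for {text}")
}

async fn lookup() -> &'static str {
    r#"{"results": []}"#
}

fn app() -> Router {
    Router::new()
        .route("/w/", get(home))
        .route("/w/word/{text}/", get(word_detail))
        .route("/w/lookup/", post(lookup))
}
```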