Sunday, 1 February 2026
Show HN: Is AI "good" yet? – tracking HN sentiment on AI coding https://bit.ly/4qfxfhU
Show HN: Is AI "good" yet? – tracking HN sentiment on AI coding A survey tracking developer sentiment on AI-assisted coding through Hacker News posts. https://bit.ly/4q7Kp0j February 2, 2026 at 03:06AM
Show HN: Wikipedia as a doomscrollable social media feed https://bit.ly/3Oj4jbm
Show HN: Wikipedia as a doomscrollable social media feed https://bit.ly/4rj1aXw February 2, 2026 at 01:12AM
Show HN: NanoClaw – “Clawdbot” in 500 lines of TS with Apple container isolation https://bit.ly/4qau5fm
Show HN: NanoClaw – “Clawdbot” in 500 lines of TS with Apple container isolation I’ve been running Clawdbot for the last couple weeks and have genuinely found it useful but running it scares the crap out of me. OpenClaw has 52+ modules and runs agents with near-unlimited permissions in a single Node process. NanoClaw is ~500 lines of core code, agents run in actual Apple containers with filesystem isolation. Each chat gets its own sandboxed context. This is not a swiss army knife. It’s built to match my exact needs. Fork it and make it yours. https://bit.ly/4qTY7oY February 1, 2026 at 11:49PM
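The per-chat isolation idea can be sketched in miniature: give every chat its own scratch directory and run each agent turn as a separate short-lived process. This is a simplified stand-in with hypothetical names, not NanoClaw's actual code — the real project gets filesystem isolation from Apple containers, not from plain subprocesses:

```python
import pathlib
import subprocess
import sys
import tempfile

def run_agent_turn(chat_id: str, code: str) -> str:
    """Run one agent turn in its own process, confined to a per-chat scratch dir.

    Stand-in for NanoClaw's model: real isolation comes from Apple
    containers; a subprocess with its own cwd only approximates it."""
    workdir = pathlib.Path(tempfile.gettempdir()) / f"chat-{chat_id}"
    workdir.mkdir(exist_ok=True)
    result = subprocess.run(
        [sys.executable, "-c", code],   # a fresh process per turn
        cwd=workdir,                    # each chat starts in its own directory
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout.strip()

out = run_agent_turn("demo", "print('hello from the sandbox')")
```

The design point is the same as the post's: a crashed or misbehaving agent takes down only its own sandboxed context, not the whole runtime.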
Saturday, 31 January 2026
Show HN: Peptide calculators ask the wrong question. I built a better one https://bit.ly/4r36xtW
Show HN: Peptide calculators ask the wrong question. I built a better one Most peptide calculators ask the wrong question. They ask: How much water are you adding? But in practice, what you actually know is your vial size and your target dose. The water amount should be the output, not the input. It should also make your dose land on a real syringe tick mark, not something like 17.3 units. I built a peptide calculator that works this way: https://bit.ly/4r36y0Y
What's different:
- You pick vial size and target dose → reconstitution is calculated for you
- Doses align to actual syringe markings
- Common dose presets per peptide
- Works well on mobile (where this is usually done)
- Supports blends and compounds (e.g. GLOW or CJC-1295 + Ipamorelin)
- You can save your vials. No account required.
Happy to hear feedback or edge cases worth supporting. https://bit.ly/4r36y0Y February 1, 2026 at 03:02AM
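Inverting the calculation is simple arithmetic: fix the tick mark you want the dose to land on and solve for the water volume. A minimal sketch, assuming a U-100 syringe (100 units = 1 mL); the function name and defaults are illustrative, not the calculator's actual code:

```python
def reconstitution_for_tick(vial_mg: float, dose_mg: float,
                            target_units: int = 10,
                            units_per_ml: int = 100) -> float:
    """Return mL of water to add so that `dose_mg` lands exactly on the
    `target_units` tick of a U-100 syringe (units_per_ml = 100)."""
    # volume (mL) that corresponds to the desired tick mark
    dose_ml = target_units / units_per_ml
    # needed concentration is dose_mg / dose_ml, so water = vial_mg / concentration
    return vial_mg * dose_ml / dose_mg

# e.g. 5 mg vial, 0.25 mg target dose, land on the 10-unit mark
water_ml = reconstitution_for_tick(vial_mg=5.0, dose_mg=0.25)  # → 2.0 mL
```

Sanity check: 2.0 mL into a 5 mg vial gives 2.5 mg/mL, so 0.25 mg is 0.1 mL, exactly the 10-unit mark.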
Show HN: I built a receipt processor for Paperless-ngx https://bit.ly/4tcdRFa
Show HN: I built a receipt processor for Paperless-ngx Hi all, I wanted a robust way to keep track of my receipts without keeping them in a box, and so I found Paperless - but the existing Paperless AI projects didn't really convert my receipts to usable data. So I created a fork of nutlope's receipthero (actually it's a complete rewrite; the only thing that carried over is the system prompt). The goal of this project is to be a one-stop shop for automatically detecting tagged docs and converting them to JSON using schema definitions - that includes invoices, .... I can't think of any others right now, maybe you can? If you do, please make an issue for it! I would appreciate any feedback/issues, thanks! (P.S. I made sure it's simple to set up with dockge/a basic docker-compose.yml) repo: https://bit.ly/4a61i5v tutorial: https://youtu.be/LNlUDtD3og0 February 1, 2026 at 01:17AM
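The schema-definition idea can be sketched as: declare the fields a receipt must have, then reject model output that doesn't conform before it touches your data. A toy version — the field names here are hypothetical, not the project's actual schema:

```python
import json

# Illustrative schema in the spirit of the post's "schema definitions";
# these field names are assumptions, not the repo's real ones.
RECEIPT_SCHEMA = {
    "merchant": str,
    "date": str,
    "total": float,
    "items": list,
}

def validate_receipt(raw: str) -> dict:
    """Parse the model's JSON output and check it against the schema."""
    data = json.loads(raw)
    for field, typ in RECEIPT_SCHEMA.items():
        if field not in data:
            raise ValueError(f"missing field: {field}")
        if not isinstance(data[field], typ):
            raise TypeError(f"{field} should be {typ.__name__}")
    return data

doc = '{"merchant": "ACME", "date": "2026-01-31", "total": 12.5, "items": ["widget"]}'
receipt = validate_receipt(doc)
```

Gating LLM output through a schema like this is what turns "AI reads the receipt" into usable structured data: malformed extractions fail loudly instead of polluting the archive.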
Show HN: An Open Source Alternative to Vercel/Render/Netlify https://bit.ly/49Q4u6C
Show HN: An Open Source Alternative to Vercel/Render/Netlify https://bit.ly/4kdJp9B February 1, 2026 at 01:40AM
Friday, 30 January 2026
Show HN: Foundry – Turns your repeated workflows into one-click commands https://bit.ly/4a3z77o
Show HN: Foundry – Turns your repeated workflows into one-click commands https://bit.ly/4a54r5L January 31, 2026 at 01:40AM
Show HN: Using World Models for Consistent AI Filmmaking https://bit.ly/4ab3mcH
Show HN: Using World Models for Consistent AI Filmmaking https://bit.ly/4a7AWjS January 30, 2026 at 10:41PM
Thursday, 29 January 2026
Show HN: Mystral Native – Run JavaScript games natively with WebGPU (no browser) https://bit.ly/4amiBRb
Show HN: Mystral Native – Run JavaScript games natively with WebGPU (no browser) Hi HN, I've been building Mystral Native — a lightweight native runtime that lets you write games in JavaScript/TypeScript using standard Web APIs (WebGPU, Canvas 2D, Web Audio, fetch) and run them as standalone desktop apps. Think "Electron for games" but without Chromium. Or a JS runtime like Node, Deno, or Bun but optimized for WebGPU (and bundling a window / event system using SDL3).
Why: I originally started building a new game engine in WebGPU, and I loved the iteration loop of writing TypeScript & instantly seeing the changes in the browser with hot reloading. After getting something working and shipping a demo, I realized that shipping a whole browser doesn't really work if I also want the same codebase to work on mobile. Sure, I could use a webview, but that's not always a good or consistent experience for users - there are nuances with Safari on iOS supporting WebGPU, but not the same features that Chrome does on desktop. What I really wanted was a WebGPU runtime that is consistent & works on any platform. I was inspired by Deno's --unsafe-webgpu flag, but I realized that Deno probably wouldn't be a good fit long term because it doesn't support iOS or Android & doesn't bundle a window / event system (they have "bring your own window", but that means writing a lot of custom code for events, dealing with windowing, not to mention more specific things like implementing a WebAudio shim, etc.). So that got me down the path of building a native runtime specifically for games & that's Mystral Native.
So now with Mystral Native, I can have the same developer experience (write JS, use shaders in WGSL, call requestAnimationFrame) but get a real native binary I can ship to players on any platform without requiring a webview or a browser. No 200MB Chromium runtime, no CEF overhead, just the game code and a ~25MB runtime.
What it does:
- Full WebGPU via Dawn (Chrome's implementation) or wgpu-native (Rust)
- Native window & events via SDL3
- Canvas 2D support (Skia), Web Audio (SDL3), fetch (file/http/https)
- V8 for JS (same engine as Chrome/Node), also supports QuickJS and JSC
- ES modules, TypeScript via SWC
- Compile to single binary (think "pkg"): `mystral compile game.js --include assets -o my-game`
- macOS .app bundles with code signing, Linux/Windows standalone executables
- Embedding API for iOS and Android (JSC/QuickJS + wgpu-native)
It's early alpha — the core rendering path works well & I've tested on Mac, Linux (Ubuntu 24.04), and Windows 11, and some custom builds for iOS & Android to validate that they can work, but there's plenty to improve. Would love to get some feedback and see where it can go! MIT licensed. Repo: https://bit.ly/4rmOWx5 Docs: https://bit.ly/46oiPVx https://bit.ly/4rmOWx5 January 27, 2026 at 07:33PM
Show HN: Free Facebook Video Downloader with Original Audio Quality https://bit.ly/3NL23cV
Show HN: Free Facebook Video Downloader with Original Audio Quality A free, web-based Facebook video downloader that actually preserves the original audio - something most Facebook downloaders fail to do. Built with Next.js and yt-dlp, it offers a clean, no-ads experience for downloading Facebook videos in multiple qualities. https://bit.ly/4t8AgDo January 30, 2026 at 03:22AM
Show HN: Play Zener Cards https://bit.ly/4rkUJTN
Show HN: Play Zener Cards just play zener cards. don't judge :) https://bit.ly/4rhsS6P January 30, 2026 at 01:39AM
Wednesday, 28 January 2026
Show HN: Codex.nvim – Codex inside Neovim (no API key required) https://bit.ly/4aq7cin
Show HN: Codex.nvim – Codex inside Neovim (no API key required) Hi HN! I built codex.nvim, an IDE-style Neovim integration for Codex.
Highlights:
- Works with OpenAI Codex plans (no API key required)
- Fully integrated in Neovim (embedded terminal workflow)
- Bottom-right status indicator shows busy/wait state
- Send selections or file tree context to Codex quickly
Repo: https://bit.ly/46kNNhf Why I built this: I wanted to use Codex comfortably inside Neovim without relying on the API. Happy to hear feedback and ideas! https://bit.ly/46kNNhf January 29, 2026 at 07:17AM
Show HN: Shelvy Books https://bit.ly/4aivwDI
Show HN: Shelvy Books Hey HN! I built a little side project I wanted to share. Shelvy is a free, visual bookshelf app where you can organize books you're reading, want to read, or have finished. Sign in to save your own collection. Not monetized, no ads, no tracking beyond basic auth. Just a fun weekend project that grew a bit. Live: https://bit.ly/45yNLSL Would love any feedback on the UX or feature ideas! https://bit.ly/45yNLSL January 29, 2026 at 02:16AM
Show HN: Drum machine VST made with React/C++ https://bit.ly/45FQ6eK
Show HN: Drum machine VST made with React/C++ Hi HN! We just launched our drum machine VST this month! We'll be updating it with many new synthesis models and unique features. Check it out, join our Discord, and show us what you made! https://bit.ly/49YmzOv January 27, 2026 at 06:03AM
Show HN: Frame – Managing projects, tasks, and context for Claude Code https://bit.ly/4rcuAqe
Show HN: Frame – Managing projects, tasks, and context for Claude Code I built Frame to better manage the projects I develop with Claude Code, to bring a standard to my Claude Code projects, to improve project and task planning, and to reduce context and memory loss.
In its current state, Frame works entirely locally. You don't need to enter any API keys or anything like that. You can run Claude Code directly using the terminal inside Frame.
Why am I not using existing IDEs? Simply because, for me, I no longer need them. What I need is an interface centered around the terminal, not a code editor. I initially built something that allowed me to place terminals in a grid layout, but then I decided to take it further. I realized I also needed to manage my projects and preserve context. I'm still at a very early stage, but even being able to build the initial pieces I had in mind within 5–6 days—using Claude Code itself—feels kind of crazy.
What can you do with Frame? You can start a brand-new project or turn an existing one into a Frame project. For this, Frame creates a set of Markdown and JSON files with rules I defined. These files exist mainly to manage tasks and preserve context. You can manually add project-related tasks through the UI. I haven't had the chance to test very complex or long-running scenarios yet, but from what I've seen, Claude Code often asks questions like: "Should I add this as a task to tasks.json?" or "Should we update project_notes.md after this project decision?" I recommend saying yes to these.
I also created a JSON file that keeps track of the project structure, down to function-level details. This part is still very raw. In the future, I plan to experiment with different data structures to help AI understand the project more quickly and effectively. As mentioned, you can open your terminals in either a grid or tab view. I added options up to a 3×3 grid. Since the project is open source, you can modify it based on your own needs.
I also added a panel where you can view and manage plugins. For code files or other files, I included a very simple editor. This part is intentionally minimal and quite basic for now. Based on my own testing, I haven't encountered any major bugs, but there might be some. I apologize in advance if you run into any issues. My core goal is to establish a standard for AI-assisted projects and make them easier to manage. I'm very open to your ideas, support, and feedback. You can see more details on GitHub: https://bit.ly/4bpLWva January 29, 2026 at 12:04AM
Tuesday, 27 January 2026
Show HN: How would you decide famous SCOTUS cases? https://bit.ly/4rht464
Show HN: How would you decide famous SCOTUS cases? https://bit.ly/4ri8xOU January 28, 2026 at 03:26AM
Show HN: Fuzzy Studio – Apply live effects to videos/camera https://bit.ly/4bnCybq
Show HN: Fuzzy Studio – Apply live effects to videos/camera Back story: I've been learning computer graphics on the side for several years now and gain so much joy from smooshing and stretching images/videos. I hope you can get a little joy as well with Fuzzy Studio! Try applying effects to your camera! My housemates and I have giggled so much making faces with weird effects! Nothing gets sent to the server; everything is done in the browser! Amazing what we can do. I've only tested on macOS... apologies if your browser/OS is not supported (yet). https://bit.ly/3LBeE1K January 27, 2026 at 04:16PM
Show HN: ACME Proxy using step-ca https://bit.ly/3NAOQ6v
Show HN: ACME Proxy using step-ca https://bit.ly/4k15F6u January 27, 2026 at 11:12PM
Monday, 26 January 2026
Show HN: A Local OS for LLMs. MIT License. Zero Hallucinations. (Not Crank) https://bit.ly/4rhe9cd
Show HN: A Local OS for LLMs. MIT License. Zero Hallucinations. (Not Crank) The problem with LLMs isn't intelligence; it's amnesia and dishonesty. Hey HN, I've spent the last few months building Remember-Me, an open-source "Sovereign Brain" stack designed to run entirely offline on consumer hardware. The core thesis is simple: don't rent your cognition. Most RAG (Retrieval-Augmented Generation) implementations are just "grep for embeddings." They are messy, imprecise, and prone to hallucination. I wanted to solve the "context integrity" problem at the architectural layer.
The Tech Stack (How it works):
- QDMA (Quantum Dream Memory Architecture): instead of a flat vector DB, it uses a hierarchical projection engine. It separates "Hot" (Recall) from "Cold" (Storage) memory, allowing for effectively infinite context window management via compression.
- CSNP (Context Switching Neural Protocol), the Hallucination Killer: this is the most important part. Every memory fragment is hashed into a Merkle chain. When the LLM retrieves context, the system cryptographically verifies the retrieval against the immutable ledger. If the hash doesn't match the chain, the retrieval is rejected. Result: the AI literally cannot "make things up" about your past because it is mathematically constrained to the ledger.
- Local Inference: built on top of the llama.cpp server. It runs Llama-3 (or any GGUF) locally. No API keys. No data leaving your machine.
Features:
- Zero-Dependency: runs on Windows/Linux with just Python and a GPU (or CPU).
- Visual Interface: includes a Streamlit-based "Cognitive Interface" to visualize memory states.
- Open Source: MIT License.
This is an attempt to give "Agency" back to the user. I believe that if we want AGI, it needs to be owned by us, not rented via an API. Repository: https://bit.ly/49BNC3c I'd love to hear your feedback on the Merkle-verification approach. Does constraining the context window effectively solve the "trust" issue for you?
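For readers weighing the Merkle-verification claim, the underlying mechanism is a standard append-only hash chain, which can be sketched in a few lines. This is an illustrative reconstruction with invented names, not the project's actual CSNP code:

```python
import hashlib

def chain_hash(prev_hash: str, fragment: str) -> str:
    """Link a fragment to the chain by hashing it with the previous hash."""
    return hashlib.sha256((prev_hash + fragment).encode()).hexdigest()

class MemoryLedger:
    """Append-only hash chain: a simplified stand-in for a Merkle-chained ledger."""
    def __init__(self):
        self.fragments = []
        self.hashes = ["0" * 64]  # genesis hash

    def append(self, fragment: str):
        self.fragments.append(fragment)
        self.hashes.append(chain_hash(self.hashes[-1], fragment))

    def verify(self, index: int, fragment: str) -> bool:
        # recompute the link; a tampered or fabricated fragment breaks the chain
        return chain_hash(self.hashes[index], fragment) == self.hashes[index + 1]

ledger = MemoryLedger()
ledger.append("user prefers dark mode")
ok = ledger.verify(0, "user prefers dark mode")    # matches the ledger
bad = ledger.verify(0, "user prefers light mode")  # rejected: hash mismatch
```

Worth noting the limits of the technique: hashing guarantees the retrieved fragment is byte-identical to what was stored, but it cannot stop the model from misinterpreting or embellishing a correctly retrieved fragment in its generated answer.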
It's fully working and fully tested. If you tried to git clone before without luck (this is not my first Show HN on this), feel free to try again. To everyone who HATES AI slop, greedy corporations, and having their private data stuck on cloud servers: you're welcome. Cheers, Mohamad. Author's note: Updated successfully. Framework 50 is active. For anyone passing by: yes, this is a big deal. Eliminating AI hallucination is a $60 billion market problem, and I'm giving THAT, plus sovereign control of your DATA, plus the capability to do high-end research via Framework 50 (including advanced scientific research), for FREE, under an MIT license. If you don't take advantage of this, you are an idiot. If you do, welcome to the future. P.S.: What do I get from lying? I got 36 stars on the repo, many from high-end senior engineers at Fortune 500 companies. If you're too stupid to tell the real deal from a lie, then keep it moving, son. https://bit.ly/49BNC3c January 27, 2026 at 05:56AM
Show HN: LocalPass offline password manager. Zero cloud. Zero telemetry https://bit.ly/49YyqvY
Show HN: LocalPass offline password manager. Zero cloud. Zero telemetry I've released LocalPass — a local-first, offline password manager with zero cloud, zero telemetry, and zero vendor lock-in. 100% local storage, 100% open-source. https://bit.ly/3M0oES2 January 26, 2026 at 11:38PM
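A local-first design like this usually comes down to deriving a key from the master password on-device and storing only a salt plus a verifier on disk, so nothing usable ever leaves the machine. A generic PBKDF2 sketch follows; this is an assumption for illustration, and LocalPass's actual scheme may differ:

```python
import hashlib
import hmac
import os

def make_verifier(master_password: str, iterations: int = 200_000):
    """Derive a key from the master password; only salt + verifier are stored locally."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, iterations)
    return salt, key

def check_password(master_password: str, salt: bytes, verifier: bytes,
                   iterations: int = 200_000) -> bool:
    """Re-derive the key and compare in constant time to unlock the vault."""
    key = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, iterations)
    return hmac.compare_digest(key, verifier)

salt, verifier = make_verifier("correct horse battery staple")
```

The slow, salted derivation makes offline guessing expensive, and `hmac.compare_digest` avoids timing leaks; with no server component there is simply nothing to exfiltrate or to go down.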