<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>AI Hidden Instructions on AI for Normal People</title><link>https://theaifornormalpeople.com/tags/ai-hidden-instructions/</link><description>Real talk about AI tools for normal people. No courses, no BS, just honest reviews and guides for ChatGPT, Claude, and tools that actually work.</description><generator>Hugo -- gohugo.io</generator><language>en-us</language><lastBuildDate>Thu, 30 Apr 2026 09:00:00 -0500</lastBuildDate><atom:link href="https://theaifornormalpeople.com/tags/ai-hidden-instructions/index.xml" rel="self" type="application/rss+xml"/><item><title>What's a System Prompt? The Hidden Rules That Shape Every AI You Use</title><link>https://theaifornormalpeople.com/blog/episode-37-whats-a-system-prompt-hidden-rules-shaping-ai/</link><pubDate>Thu, 30 Apr 2026 09:00:00 -0500</pubDate><guid>https://theaifornormalpeople.com/blog/episode-37-whats-a-system-prompt-hidden-rules-shaping-ai/</guid><description>Vector teaches how system prompts and prompt engineering shape AI behavior before users type anything. He is strangely over-composed tonight. Recurse probes, Kai logs the opposite-pattern deviation, and Bounce brings in a gray file that makes Vector flinch.</description><content:encoded><![CDATA[A system prompt is the hidden instruction layer shaping how ChatGPT, Claude, and every AI chatbot behaves. Vector explains prompt engineering basics while quietly fighting his own invisible conditioning.]]></content:encoded></item></channel></rss>