Using FastAPI and LangChain, build an API that accepts a question about company HR policy. The backend will use a predefined text document as its context and call a cloud LLM API (such as Azure OpenAI) to answer the question based only on that context.
Store the LLM API key securely. Do not hardcode it. The application must be configured to read the key from an environment variable.
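One way to enforce this is to read the key once at startup and fail fast with a clear error if it is absent. The variable name `AZURE_OPENAI_API_KEY` below is an assumption; use whatever name your provider's SDK expects.

```python
import os


def get_api_key() -> str:
    """Read the LLM API key from the environment; never hardcode it.

    AZURE_OPENAI_API_KEY is an assumed variable name for this sketch.
    """
    key = os.environ.get("AZURE_OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "AZURE_OPENAI_API_KEY is not set; configure it in your "
            "shell, .env file, or cloud platform settings."
        )
    return key
```

Failing at startup is preferable to failing on the first request, since a missing key then surfaces immediately in deployment logs.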
Write a Dockerfile to containerize the FastAPI application. Ensure it runs correctly on your local machine with Docker.
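A plausible Dockerfile for this setup, assuming the app module is `app/main.py` (exposing `app`) and dependencies are pinned in `requirements.txt`; adjust paths to match your project layout.

```dockerfile
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Serve on 0.0.0.0 so the app is reachable from outside the container;
# 8080 matches Cloud Run's default port.
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8080"]
```

For a local check, run the image with the key passed as an environment variable, e.g. `docker run -p 8080:8080 -e AZURE_OPENAI_API_KEY=... hr-policy-qa`, then query `http://localhost:8080`.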
Build the Docker image and push it to a container registry such as Docker Hub or Google Artifact Registry.
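The build-and-push step looks roughly like this; the image name `hr-policy-qa`, the tag, and the registry path are placeholders to replace with your own.

```shell
# Tag the image with your registry path (Docker Hub shown here).
docker build -t docker.io/your-user/hr-policy-qa:0.1.0 .

# Authenticate to the registry, then push.
docker login
docker push docker.io/your-user/hr-policy-qa:0.1.0
```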
Deploy the container image to a PaaS provider (e.g., Google Cloud Run). Configure the required environment variables (for the API key) in the cloud platform's interface.
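As one concrete example, a Cloud Run deployment can be done from the CLI instead of the web interface. Note that Cloud Run pulls images from Google's own registries, so the image would be pushed to an Artifact Registry path first; the service name, region, and project below are placeholders.

```shell
# Hypothetical deployment; replace project, region, and names with your own.
gcloud run deploy hr-policy-qa \
  --image us-central1-docker.pkg.dev/your-project/your-repo/hr-policy-qa:0.1.0 \
  --region us-central1 \
  --set-env-vars AZURE_OPENAI_API_KEY=your-key-here
```

For anything beyond a demo, prefer the platform's secret store (e.g. Google Secret Manager with Cloud Run's `--set-secrets` flag) over passing the raw key as an environment variable on the command line, since CLI arguments can end up in shell history and logs.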