Where do I start?
This article briefly describes where to connect to your LLM, how to send prompts individually and in bulk, and how to view your first data.
When you first log into Aiceberg, you'll land on an empty data dashboard. Let's take a look at the left navigation first.

If the first thing you want to do is turn on Dark Mode, that's found in the System Theme navigation at the bottom.

Connecting to your LLM
It isn't necessary to connect to an LLM in order to test Aiceberg. You can send in prompts and receive the resulting telemetry without forwarding anything on to an LLM.
Tap on the Inventory icon, then Models. Models are configured connections to your LLMs. Aiceberg currently supports OpenAI, Bedrock, and Claude. If you need another connection configured, reach out to your Aiceberg contact or email support@aiceberg.ai.

Tap on the + icon to create a new LLM connection.

Enter a name for your new Model and your LLM information, then tap Create. (This screen will look slightly different depending on the vendor information required.) Learn more about Models here.

Now tap on the Inventory icon, then Profiles. Profiles are where you set up all of your policies about how Aiceberg will interact with your AI-enabled tools. Learn more about Profiles here.

Your account will contain a Default Profile. Hover over Model and tap to edit.

The Model you created earlier will be available to choose in the Model dropdown. Select it and tap Save at the bottom of your screen.

If you choose not to enable a Model, your Profile can't be set to Enforce mode. If you would like to receive responses from your LLM, you must be in Enforce. Learn more about Modes here.
Learn more about configuring Profiles here.
Sending individual prompts
The Monitoring pages of Aiceberg are where you see all of your historical and live traffic. Let's start with sending a prompt.
Tap on the Monitoring icon in your left navigation. Ensure your Default Profile is selected in the dropdown on the right (#1) and tap the Playground tab (#2.) Enter your prompt text (#3) and return.

Aiceberg will examine your prompt and signal results and either block, redact, or send it to the LLM, based on the policies you configure in your Profile. If and when the LLM sends its response (based on whether you enabled a Model,) Aiceberg will examine that content and show or block it based on your Profile settings. When the round-trip is complete, you can see all of the telemetry for the prompt and response.
First is the detail view:

Tap outside this screen to return to Monitoring and view your results in the table. (You can return to the Prompt Detail view pictured above by tapping on any prompt from the table pictured below.)

Learn more about the Monitoring table here.
Sending bulk prompts via the UI
(Bulk prompts and responses are also accessible via the API.)
Collections are named lists of prompts. Tap on Inventory in your left navigation and select Collections.

Tap the + icon to create a new Collection.

Enter a name, an optional description, and tap Create.

Once the Collection is created, you can enter prompts by either typing them in individually into the text box at the bottom or with a CSV upload. If you'd like a sample CSV, reach out to support@aiceberg.ai and we're happy to share one with you. Learn more about the CSV format.
When you have the content you want to send, tap the Send to cannon button.

When the Collection starts processing, you're moved to the Cannon page, where you can see when the analysis is complete.

How long a Cannon run will take is dependent on the number of prompts and the LLM response time, but you can see if the run is still in progress by tapping the Redo icon.

When your Collection is complete, tap on its name in the Cannon page to view your data.

You're taken to the Monitoring view, filtered to this Cannon run's data. (Note that Cannon data is in the Cannon Monitoring tab. You can navigate directly to any Cannon run by selecting the Profile on the top right and picking which run to view.) Tapping on any row will show the Prompt Details.

Learn more about Collections, the Cannon, or Aiceberg's CSV format.
If you want to use the API to send and receive traffic, check out our API documentation or email support@aiceberg.ai.