Swiftide 0.6 - Qdrant named vector support and templated prompts

Published by Timon Vonk

Introducing Swiftide 0.6! This release adds named vector support for Qdrant, templated prompts and (many) documentation improvements.

To get started with Swiftide, head over to swiftide.rs or check us out on GitHub.

Named vectors with Qdrant and Embed modes

Shoutout to @pwalski for the implementation.

In addition to embedding the combined metadata and chunk, you might also want to embed each metadata field and the chunk individually, or do both. The new EmbedMode setting on the pipeline controls this behaviour.

For example, the following pipeline will add vectors for each metadata field and the chunk:

indexing::Pipeline::from_loader(FileLoader::new("LICENSE"))
    .with_concurrency(1)
    .with_embed_mode(EmbedMode::PerField)
    .then_chunk(ChunkMarkdown::from_chunk_range(20..2048))
    .then(MetadataQAText::new(openai_client.clone()))
    .then(MetadataTitle::new(openai_client.clone()))
    .then_in_batch(10, Embed::new(openai_client.clone()))
    .log_all()
    .filter_errors()
    .then_store_with(
        Qdrant::builder()
            .batch_size(50)
            .vector_size(1536)
            .collection_name("swiftide-multi-vectors")
            .with_vector(EmbeddedField::Chunk)
            .with_vector(EmbeddedField::Metadata(metadata_qa_text::NAME.into()))
            .with_vector(
                VectorConfig::builder()
                    .embedded_field(EmbeddedField::Metadata(metadata_title::NAME.into()))
                    .distance(Distance::Manhattan)
                    .build()?,
            )
            .build()?,
    )
    .run()
    .await?;
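To picture what EmbedMode::PerField does, think of each node carrying one named vector per embedded field instead of a single combined embedding. The sketch below uses only standard-library types to illustrate that shape; the field names and vector values are invented for illustration and are not Swiftide's actual node type or API.

```rust
use std::collections::HashMap;

// Illustration only: in per-field mode, a node ends up with one named
// vector per embedded field, keyed by the field's name.
fn per_field_vectors() -> HashMap<String, Vec<f32>> {
    let mut vectors = HashMap::new();
    // One vector for the chunk itself...
    vectors.insert("Chunk".to_string(), vec![0.1, 0.2, 0.3]);
    // ...and one per metadata field, mirroring the `with_vector(...)`
    // calls in the Qdrant builder above (names here are made up).
    vectors.insert("Metadata(questions_and_answers)".to_string(), vec![0.4, 0.5, 0.6]);
    vectors.insert("Metadata(title)".to_string(), vec![0.7, 0.8, 0.9]);
    vectors
}

fn main() {
    let vectors = per_field_vectors();
    // Three named vectors for one chunk, instead of a single combined one.
    assert_eq!(vectors.len(), 3);
    println!("{} named vectors for one chunk", vectors.len());
}
```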

Prompt templating

Breaking change: previously, prompts were simple strings, which lacked flexibility. Prompts now use Tera under the hood and can be fully customized with Jinja-style templates.

Prompt templates are compiled and stored when created, then rendered with a context provided by, for example, a transformer.

Additionally, &str and String implement Into<Prompt>, so migrating is usually as simple as adding .into().

For example, customizing the prompt of MetadataQAText:

let template = PromptTemplate::try_compiled_from_str(
    "Please generate {{questions}} questions for {{node.chunk}}",
)?;

MetadataQAText::builder()
    .prompt_template(template)
    .build()?;

Or prompting manually with the openai client:

// With a template
let template = PromptTemplate::try_compiled_from_str("hello {{world}}")?;
let prompt = template.to_prompt().with_context_value("world", "awesome");
openai_client.prompt(prompt); // The prompt will render to "hello awesome"

// Or directly with a (static) string
openai_client.prompt("Hello awesome".into());

// In fact, `into` converts to a prompt, so we can also template it!
// (But don't do this at scale; it will be slower than precompiling the template.)
let prompt: Prompt = "Hello {{world}}".into();
openai_client.prompt(prompt.with_context_value("world", "awesome"));
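The compile-once, render-with-context lifecycle boils down to placeholder substitution. As a rough mental model in plain standard-library Rust (the render helper below is ours for illustration, not Tera's or Swiftide's API, and it skips everything Tera adds beyond simple substitution):

```rust
use std::collections::HashMap;

// Naive stand-in for "render template with context": replace each
// `{{key}}` placeholder with its value. Tera does far more (filters,
// loops, inheritance); this only sketches plain substitution.
fn render(template: &str, context: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (key, value) in context {
        out = out.replace(&format!("{{{{{}}}}}", key), value);
    }
    out
}

fn main() {
    let mut context = HashMap::new();
    // Mirrors `template.to_prompt().with_context_value("world", "awesome")`.
    context.insert("world", "awesome");
    assert_eq!(render("hello {{world}}", &context), "hello awesome");
}
```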

Tera offers much more than simple substitution, and we're looking forward to the possibilities this brings.

What’s next?

We're working hard on a query pipeline, a demo application, and improving Swiftide and its documentation across the board.

Call for contributors

We cannot do everything alone and would love your help. There is a long list of desired features, and many more unlisted ideas, over at our issues page.


You can find the full changelog here.

To get started with Swiftide, head over to swiftide.rs or check us out on GitHub.