Incant – add magic spells to your code
21 points by montyanderson 4 weeks ago | 11 comments

- thih9 3 weeks ago
Note that this is a very simple library and not very efficient. E.g. for the code that filters an array, it would run N prompts [1]:

  `You are a filter agent.\nYour job is to return whether an item matches the criteria: ${criteria}\nRespond only with true or false.`

It's a cool demo, but I wouldn't use that in production; IMO having that code in a separate library offers little benefit and increases the risk of misuse.
[1]: https://github.com/montyanderson/incant/blob/73606e826d6e5b0...
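A minimal sketch of the per-item pattern thih9 is describing, with a mocked model call standing in for a real LLM (`AskModel`, `mockAsk`, and the "even number of letters" criterion are illustrative assumptions, not the library's actual code):

```typescript
// Hypothetical sketch: one model call per array element, so N items
// cost N prompts. `ask` stands in for a real LLM call.
type AskModel = (prompt: string) => Promise<string>;

function createFilter(ask: AskModel, criteria: string) {
  return async (items: string[]): Promise<string[]> => {
    const kept: string[] = [];
    for (const item of items) {
      // One prompt per item: this is the N-calls cost being discussed.
      const answer = await ask(
        `You are a filter agent.\n` +
          `Your job is to return whether an item matches the criteria: ${criteria}\n` +
          `Respond only with true or false.\n` +
          `Item: ${item}`
      );
      if (answer.trim().toLowerCase() === "true") kept.push(item);
    }
    return kept;
  };
}

// Mock model for illustration: answers "true" for even-length items.
const mockAsk: AskModel = async (prompt) => {
  const item = prompt.split("Item: ")[1];
  return item.length % 2 === 0 ? "true" : "false";
};

const filterEven = createFilter(mockAsk, "an even number of letters");
filterEven(["ab", "abc", "abcd"]).then((kept) => console.log(kept));
// → [ "ab", "abcd" ]
```

Batching all items into a single prompt would cut this to one call, at the cost of a more complex response format to validate.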
- voidUpdate 3 weeks ago
So this is just asking an LLM to filter or select from an array? Where do the magic spells come in?
- helloplanets 3 weeks ago
How does this differ from function calling? For example, the basic enums example for Gemini function calling:
> color_temp: { type: Type.STRING, enum: ['daylight', 'cool', 'warm'], description: 'Color temperature of the light fixture, which can be `daylight`, `cool` or `warm`.', }
https://ai.google.dev/gemini-api/docs/function-calling?examp...
- supermatt 3 weeks ago
It’s the inverse of function calling. Here the function is calling the LLM, not vice versa.
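The distinction can be illustrated with hypothetical stubs (neither Gemini's nor Incant's actual API): in function calling the model's output asks your code to run a function, while here your function asks the model.

```typescript
// Hypothetical stubs contrasting the two directions of control.
type ToolCall = { name: string; args: Record<string, string> };

// Function calling: your code runs because the LLM requested it.
function handleModelOutput(call: ToolCall): string | null {
  if (call.name === "setLightColor") return call.args.color_temp;
  return null;
}

// The inverse pattern: the LLM runs because your code requested it.
async function pickColor(ask: (p: string) => Promise<string>): Promise<string> {
  return ask("Pick one of: daylight, cool, warm");
}

// Mock model for illustration.
const mockAsk = async (_prompt: string) => "warm";

console.log(handleModelOutput({ name: "setLightColor", args: { color_temp: "cool" } }));
pickColor(mockAsk).then((c) => console.log(c));
```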
- marcus_holmes 3 weeks ago
I'm curious how the hallucination-free guarantee works? Does it only guarantee that the output is a subset of the input?
In the case of the male names, if I include a gender-neutral name like "Sam" does that include it because it is a male name, or exclude it because it is a female name? Can I set this to be inclusive or exclusive?
Looks interesting, though. Nice work.
- kinduff 3 weeks ago
There is a filter step in `createFilter` [1], and `createSelector` throws if the returned index doesn't exist in the array [2]. Maybe this is what the author refers to as hallucination-free, but it falls pretty short.
[1]: https://github.com/montyanderson/incant/blob/master/mod.ts#L...
[2]: https://github.com/montyanderson/incant/blob/master/mod.ts#L...
- marcus_holmes 3 weeks ago
Yeah, so it's just guaranteeing that the output is a subset of the inputs, thanks for the clarification.
- jollyllama 3 weeks ago
vibecoding is a hell of a drug
- supermatt 3 weeks ago
> no hallucinations possible
It can still hallucinate a response that is defined in the filter.
E.g if you have a filter with names of capital cities [“London”, “Paris”, “Madrid”] , and you ask “What is the capital of France” it could respond “Madrid”
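The limit of the guarantee can be sketched with a hypothetical validation helper (`validateSelection` is illustrative, not the library's actual code): the check only ensures the answer is one of the allowed options, not that it is correct.

```typescript
// Hypothetical guard: accepts any answer that is in the allowed set,
// even a factually wrong one; only off-list answers are rejected.
function validateSelection(options: string[], answer: string): string {
  if (!options.includes(answer)) {
    throw new Error(`Model returned "${answer}", which is not an option`);
  }
  return answer;
}

const capitals = ["London", "Paris", "Madrid"];

// A wrong-but-allowed answer passes the guard: "Madrid" is in the list,
// even though it is not the capital of France.
console.log(validateSelection(capitals, "Madrid"));

// An off-list answer is the only thing the guard prevents.
try {
  validateSelection(capitals, "Berlin");
} catch {
  console.log("rejected off-list answer");
}
```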
- ycombinatrix 3 weeks ago
Is that a hallucination, or is it just plain wrong?
- supermatt 3 weeks ago
An AI hallucination is any response that contains false or misleading information presented as fact. So a wrong answer is a hallucination.