AI-Powered Token Classification

Use an LLM to automatically classify, tag, and enrich token metadata based on object properties.

What You'll Build

Token libraries grow fast, and manual tagging doesn't scale. In this tutorial you'll build an AI classification pipeline that reads token properties, uses an LLM to generate semantic tags, and writes those tags back to the objects, automatically organizing your entire collection.

Step 1: Set Up the Project

Create a new Node.js project with the required dependencies:

```bash
mkdir dual-ai-classifier && cd dual-ai-classifier
npm init -y
npm install openai node-fetch
```
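Note: the snippets in this tutorial use ES module `import` syntax, so the generated package.json needs `"type": "module"`. A minimal example (the version ranges shown are illustrative, not pinned by this tutorial):

```json
{
  "name": "dual-ai-classifier",
  "type": "module",
  "dependencies": {
    "openai": "^4.0.0",
    "node-fetch": "^3.0.0"
  }
}
```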

Step 2: Fetch Objects to Classify

Start by pulling a batch of untagged objects from your Dual instance:

```javascript
import fetch from 'node-fetch'; // optional on Node 18+, where fetch is global

const API = 'https://api-testnet.dual.network';

// Fetch up to 50 objects of the given template that have no ai_tags yet.
async function fetchUntaggedObjects(token, templateId) {
  const res = await fetch(`${API}/objects/search`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      filters: {
        template_id: templateId,
        'properties.ai_tags': { '$exists': false }
      },
      limit: 50
    })
  });
  return (await res.json()).results;
}
```

Step 3: Classify with an LLM

Send each object's properties to the LLM for semantic analysis and tag generation:

```javascript
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Ask the model for a structured classification of one object's properties.
async function classifyObject(obj) {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [
      {
        role: 'system',
        content: `You are a token classification engine. Given an object's properties, return a JSON object with:
- "category": one of ["loyalty", "collectible", "access-pass", "certificate", "coupon", "identity"]
- "tags": array of 3-5 descriptive tags
- "sentiment": "positive", "neutral", or "negative"
- "summary": one-sentence description`
      },
      {
        role: 'user',
        content: JSON.stringify(obj.properties)
      }
    ],
    response_format: { type: 'json_object' }
  });
  return JSON.parse(completion.choices[0].message.content);
}
```
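The model's output is untrusted JSON, so it's worth validating the shape before writing it back in Step 4. A minimal sketch, assuming the categories and sentiments listed in the system prompt (`validateClassification` is a hypothetical helper, not part of any SDK):

```javascript
// Allowed values, mirroring the system prompt in classifyObject.
const VALID_CATEGORIES = ['loyalty', 'collectible', 'access-pass', 'certificate', 'coupon', 'identity'];
const VALID_SENTIMENTS = ['positive', 'neutral', 'negative'];

// Throw if the LLM's classification doesn't match the expected shape.
function validateClassification(c) {
  if (!VALID_CATEGORIES.includes(c.category)) {
    throw new Error(`Unexpected category: ${c.category}`);
  }
  if (!Array.isArray(c.tags) || c.tags.length === 0) {
    throw new Error('tags must be a non-empty array');
  }
  if (!VALID_SENTIMENTS.includes(c.sentiment)) {
    throw new Error(`Unexpected sentiment: ${c.sentiment}`);
  }
  if (typeof c.summary !== 'string' || c.summary.length === 0) {
    throw new Error('summary must be a non-empty string');
  }
  return c;
}
```

Calling `validateClassification` on the parsed result lets a bad response fail loudly instead of silently writing malformed tags to your objects.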

Step 4: Write Tags Back to Dual

Update each object with the AI-generated classification:

```javascript
// Merge the AI-generated classification into the object's properties.
async function tagObject(token, objectId, classification) {
  await fetch(`${API}/objects/${objectId}`, {
    method: 'PATCH',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      properties: {
        ai_tags: classification.tags,
        ai_category: classification.category,
        ai_summary: classification.summary,
        ai_classified_at: new Date().toISOString()
      }
    })
  });
}
```
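Both the Dual and OpenAI calls can fail transiently (rate limits, network blips). One way to harden them is a small retry wrapper with exponential backoff; `withRetry` below is an illustrative sketch, not a library function:

```javascript
// Hypothetical helper: retry an async operation with exponential backoff.
async function withRetry(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off: 500ms, 1000ms, 2000ms, ... before the next attempt.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

In the pipeline you could then write `await withRetry(() => tagObject(token, obj.id, classification))` so a single failed PATCH doesn't abort the whole batch.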

Step 5: Run the Pipeline

Wire it all together into a batch classification pipeline:

```javascript
// Fetch a batch of untagged objects, classify each, and write tags back.
async function runClassificationPipeline(token, templateId) {
  const objects = await fetchUntaggedObjects(token, templateId);
  console.log(`Classifying ${objects.length} objects...`);
  for (const obj of objects) {
    const classification = await classifyObject(obj);
    await tagObject(token, obj.id, classification);
    console.log(`Tagged ${obj.id}: ${classification.category}`);
  }
  console.log('Classification complete.');
}
```

Cost Optimization: Batch multiple objects into a single LLM call by sending an array of properties. Classifying ten objects per call cuts API requests and cost by up to 10x. Use GPT-4o-mini or Claude Haiku for classification tasks; they're fast and cheap.
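The batching tip can be sketched as follows, reusing the OpenAI client from Step 3. The `{"classifications": [...]}` response shape is an assumption enforced only by the prompt, so validate it before trusting the output:

```javascript
// Split an array into chunks of a given size.
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Sketch: classify up to batchSize objects per LLM call instead of one each.
// `openai` is an OpenAI client instance (e.g. `new OpenAI()` from Step 3).
async function classifyBatch(openai, objects, batchSize = 10) {
  const results = [];
  for (const group of chunk(objects, batchSize)) {
    const completion = await openai.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: [
        {
          role: 'system',
          content:
            'You are a token classification engine. For EACH object in the input array, ' +
            'return a classification with "category", "tags", "sentiment", and "summary". ' +
            'Respond with a JSON object {"classifications": [...]} in the same order as the input.'
        },
        { role: 'user', content: JSON.stringify(group.map((o) => o.properties)) }
      ],
      response_format: { type: 'json_object' }
    });
    const parsed = JSON.parse(completion.choices[0].message.content);
    results.push(...parsed.classifications);
  }
  return results;
}
```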

Companion Repo: Get the full working source code for this tutorial at github.com/orgs/DualOrg/dual-ai-classifier. Clone it, add your API keys, and run it locally in minutes.

What's Next?

Now that your tokens are tagged, try Building a Conversational Token Assistant so users can query their collection with natural language.