How to chat with Obsidian using Ollama (a local language model)?

Ollama is a good way to run a local language model (like Llama 3 8B), and an AI can take in a lot of unstructured data very quickly. For example, if we copy a lot of information about one person and paste it into the AI, we can ask it to fill in a table we have already set up, and it writes each value into the correct place instead of us entering everything one by one. Chatting with the AI also makes it easy to draw charts from the data. Obsidian is a Markdown app that makes it easy to write lots of notes, so the idea is to write in Obsidian, upload the data to NocoDB, and then chat with that information automatically.
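To make the first part more concrete (asking a local model to turn pasted text into a row for an already-defined table), here is a minimal sketch that calls Ollama's local REST API. The model name, the example columns, and the `extract_table_row` helper are only assumptions for illustration, not a finished integration with Obsidian or NocoDB.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def extract_table_row(raw_text: str, columns: list[str], model: str = "llama3") -> str:
    """Ask a locally running Ollama model to turn unstructured text into one Markdown table row."""
    prompt = (
        "Extract the following fields from the text and answer with a single "
        "Markdown table row (values separated by |, no header): "
        + ", ".join(columns)
        + "\n\nText:\n"
        + raw_text
    )
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose "response" field holds the text.
        return json.loads(resp.read())["response"].strip()


# Example: paste copied information about a person and get a row for a pre-set table.
if __name__ == "__main__":
    info = "John Smith, born 1990 in Berlin, works as a data engineer at Example Corp."
    print(extract_table_row(info, ["Name", "Birth year", "City", "Job"]))
```

The same kind of call could be wrapped in a script that reads notes exported from Obsidian and pushes the extracted rows to NocoDB, but that part depends on how your tables are set up.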

Can you break down a bit more what you are looking to do?

Hopefully folks can point you in the right direction.