Adventures in AI #11: 1 + 1 = 1

I can’t believe it’s been 2 months since I wrote here! The fault lies with Twitter, of course. Since it collapsed as a learning space, I’ve been trying to find somewhere to get feedback and discussion, and have settled on LinkedIn. Now, it’s an unusual space: self-aggrandizement seems to be the name of the game for many there (which makes sense, since its underlying premise is “hire me”), but it does have a lot of people doing interesting things. So I have “blogged” a fair bit there, though it’s more pedagogical/theoretical than the practical side here. In particular, here’s an interview with one of my beta-testers, including two long discussions they had with chatOAME. I also spent much of March Break reading the current research (so much that I went and bought a new tablet to replace my Kobo, which died), and I’ve posted a few good papers there.

The biggest change is that I have taken the “Tasks” part of chatOAME, where the teacher can make content, and merged it into the regular “chat” that makes up chatOAME. Because of the way Forms works with the tool I’m using (MeowApps AIEngine, which I highly recommend; the developer is putting his all into this), it was one-and-done: you asked for something, it made it, and that was it. But that’s not what we want here. Our goal is a conversation with the AI about education and educational content. So I had to find a way to bring that into the chat. (So, adding chat + tasks = chat.)

ChatGPT (and custom GPTs) have “conversation starters” but, as you may know, when you’re planning a lesson, you don’t always know what content you want the students to engage with until you’re deep into the work. So it can’t always be a “starter”! It has to be available at any point in the conversation.

I added a toolbar above the prompt box with the tools I thought most useful. When you click a button, it inserts a text code (well, [MC] for multiple choice questions); then, after the teacher adds additional context and clicks Send, the Engine swaps out [MC] for the long prompt that describes best practices for multiple choice questions and sends it off to be done. This way, the teacher doesn’t have to remember to add all of that to their prompt, but it also means that when they get the multiple choice questions back, they can say “swap out #3 with a different [MC] question” and it will do that automagically. They can, of course, type [MC] instead of hitting the button, but this way they don’t have to remember codes. And when they click Send, [MC] changes into regular text so the conversation “reads” better afterwards.
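For the curious, here’s a minimal sketch of that swap in TypeScript. This is my illustration of the idea, not AI Engine’s actual code; every name in it (SHORTCODES, expandShortcodes, the label text) is mine.

```typescript
// A sketch of code-for-prompt substitution, assuming a simple lookup table.
type Shortcode = { prompt: string; label: string };

const SHORTCODES: Record<string, Shortcode> = {
  "[MC]": {
    prompt: "Craft the requested number of multiple choice questions. ...", // full best-practices prompt
    label: "multiple choice", // readable text left in the visible transcript
  },
};

// Runs when the teacher clicks Send: the model receives the expanded prompt,
// while the transcript keeps plain text so it “reads” better afterwards.
function expandShortcodes(text: string): { toModel: string; toTranscript: string } {
  let toModel = text;
  let toTranscript = text;
  for (const [code, { prompt, label }] of Object.entries(SHORTCODES)) {
    toModel = toModel.split(code).join(prompt);
    toTranscript = toTranscript.split(code).join(label);
  }
  return { toModel, toTranscript };
}
```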

Here’s my initial prompt for Multiple Choice. These prompts are not static, and I’m always looking for improvements. (Unsurprisingly, I track all this in OneNote.)

Craft the requested number of multiple choice questions. Begin by including plausible distractors to ensure all options appear reasonable. Formulate the question to prompt higher-level thinking, encouraging students to apply principles in real-world scenarios. Ensure the question and options are grammatically coherent. Balance the placement of the correct answer and keep all options around the same length to avoid bias. Clarity in instructions is paramount, guiding students on how to approach the question. Each question should focus on a single, clearly defined problem aspect of the topic, with the main idea encapsulated within the question itself.

I’ve moved the prompts out to a simple Markdown file on the server so that I can easily change them rather than playing with the code; it should also allow me to specialize the toolbar for elementary/secondary (or, gasp, individuals! Long run, folks!).
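As a sketch of what that can look like (the file name, layout, and parsing here are my assumptions, not the actual setup), each code could live under its own heading in the Markdown file and get loaded into a lookup table:

```typescript
// Hypothetical prompts.md layout (my assumption):
//   ## [MC]
//   Craft the requested number of multiple choice questions. ...
//   ## [EXIT]
//   ...
async function loadPrompts(url = "/prompts.md"): Promise<Record<string, string>> {
  const md = await (await fetch(url)).text();
  const prompts: Record<string, string> = {};
  let current: string | null = null;
  for (const line of md.split("\n")) {
    const heading = line.match(/^##\s*(\[\w+\])/); // a heading starts a new code
    if (heading) {
      current = heading[1];
      prompts[current] = "";
    } else if (current) {
      prompts[current] += line + "\n"; // everything below a heading is its prompt
    }
  }
  for (const key of Object.keys(prompts)) prompts[key] = prompts[key].trim();
  return prompts;
}
```

Editing the file then changes the toolbar’s behaviour without touching the code, which is what makes per-audience (elementary/secondary) variants cheap.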

Now, I have to be honest... I struggled with this merging of Forms for the first week of my two-week March Break without success, even with ChatGPT offering me help. It turns out that when you don’t know what you don’t know, you can’t ask the right questions! So, a very kind programmer read my frustration on Discord and wrote me the code in about three hours (and justified my frustration by saying that yes, it was a weird way for the whole thing to work, so it wasn’t just my inability). If you ever need web programming, I have a guy for you!

Finally, my conversation with the programmer convinced me to try to include other AI engines, so I brought in Anthropic’s engine, Claude, as an option. There’s only one BIG problem: there’s no way to add the same kind of lens to Claude that we have in OpenAI’s Assistants. So, when you ask it a question, it gives a really nice response, but it’s generic and isn’t bent towards the OAME’s Mission and Vision nor Ontario mathematics. For the moment, it’ll stay in testing, and as we (and they) develop, we’ll see if there are options to use it. If you haven’t tried Claude, it writes more naturally than ChatGPT-4; there’s a stiltedness (and an obsession with adjectives) to ChatGPT that I haven’t seen in Claude.
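For context, a bare Claude call through Anthropic’s Messages API looks something like the sketch below. You can pass per-request system instructions (the closest thing to a lens), but there’s no persistent, file-grounded Assistant setup on their side; the model name and system text here are placeholders of mine.

```typescript
import Anthropic from "@anthropic-ai/sdk";

// Sketch only: per-request instructions are the nearest thing to a "lens" here.
async function askClaude(question: string): Promise<string> {
  const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
  const response = await client.messages.create({
    model: "claude-3-opus-20240229", // placeholder model name
    max_tokens: 1024,
    // My stand-in text; it must be re-sent with every request rather than
    // living in a persistent Assistant-style configuration.
    system: "You support Ontario mathematics teachers in line with the OAME's Mission and Vision.",
    messages: [{ role: "user", content: question }],
  });
  const block = response.content[0];
  return block.type === "text" ? block.text : "";
}
```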

There won’t be too much development on chatOAME through April: school is back on, and I have four conference sessions to prepare and present. But the mind always wanders, so if I (or someone else) come up with something, you never know.
