Combine Prompt with Selection and ChatGPT

Goal:

  1. Prompt me to write an instruction telling GPT what I want it to do with the selected text. This way I can select specific text in Drafts and do something different with it each time, rather than relying on a predefined set of instructions.

  2. Select text and combine it with the instruction from prompt.

  3. Pass text with instruction to GPT

  4. Insert the result instead of the selected text.

I tried combining the two available scripts, but I seem to be going in the wrong direction:

Script:

// Create a prompt

let f = () => {
	let p = new Prompt()
	p.title = "Ask ChatGPT"
	p.message = "What do you want ChatGPT to do with the selected text?"
	p.addTextView("instruction", "ChatGPT Prompt", "")
	p.addButton("Ask")
	
	if (!p.show()) {
		return false
	}
	const instruction = p.fieldValues["prompt"]
	if (chatPrompt.length == 0) { return false }

// Combine prompt with selected text

	// get editor values
	const [st, len] = editor.getSelectedRange()
	const selection = editor.getSelectedText()

	// build prompt
	const chatPrompt = `${instruction}: "${selection}"`

	// create OpenAI API object and use single response
	// convenience function to send prompt
	let ai = new OpenAI()
	let answer = ai.quickChatResponse(chatPrompt)

	// if we got a reply, add it to the draft
	if (answer && answer.length > 0) {
		answer = answer.trim().replace(/^"+|"+$/g, '')
		editor.setSelectedText(answer)
		editor.setSelectedRange(st, answer.length)
		return true
	}
	else {
		return false
	}
}

// Insert text
	editor.setSelectedText(content)
	return true
}

if (!f()) {
	context.cancel()
}


The below should work. You had a couple of parsing issues, and changed some names (like “prompt” to “instruction”) in some places but not all places above.

Note that I also tweaked the prompt construction to automatically include the “in this text, return only the result” so in use you could just type “check spelling”, or “uppercase” (or whatever) in the prompt and get the transformation you want.

// get your input selected text, and store range for later
const selection = editor.getSelectedText()
const [st, len] = editor.getSelectedRange()

let f = () => {
	let p = new Prompt()
	p.title = "Ask ChatGPT"
	p.message = "What do you want ChatGPT to do with the selected text?"
	p.addTextView("instruction", "ChatGPT Prompt", "")
	p.addButton("Ask")
	
	if (!p.show()) {
		return false
	}
	const instruction = p.fieldValues["instruction"]
	if (instruction.length == 0) { return false }

	// build prompt with instruction and selection
	let chatPrompt = `${instruction} of the text, returning only the result: ${selection}`
	// send that to ChatGPT
	let ai = new OpenAI()
	let answer = ai.quickChatResponse(chatPrompt)

	// if we got no reply, cancel
	if (!answer || answer.length == 0) {
		return false
	}
	
	editor.setSelectedText(answer)
	editor.setSelectedRange(st, answer.length)
	return true
}

if (!f()) {
	context.fail()
}

Thank you. What does this do?

	editor.setSelectedRange(st, answer.length)

Updates the selected text range to match the length of the new text that was just placed in the editor.
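
For example, with a selection that started at character 120 of the draft (the number is just for illustration; st and answer come from the script above):

// st was captured before the replacement, e.g. st = 120
editor.setSelectedText(answer)             // replaces the old selection with the reply
editor.setSelectedRange(st, answer.length) // re-selects exactly the newly inserted reply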

I wrapped this up in an example action: ChatGPT: Modify Selection | Drafts Directory


How can we modify this not to replace the text but to append the result below the selection?

You can remove those two lines from the script in the action:

editor.setSelectedText(answer)
editor.setSelectedRange(st, answer.length)

and replace them with

draft.append(answer)
draft.update()

That sort of works, but places the answer at the end of the draft - which might not be what you want if the selection processed was in the middle of a longer draft.

I’d suggest instead just adding the original content to the answer value before inserting it, like:

// if we got no reply, cancel
if (!answer || answer.length == 0) {
	return false
}

// ADD THIS CODE
answer = `${selection}

${answer}`
// END ADD
	
editor.setSelectedText(answer)
editor.setSelectedRange(st, answer.length)

Thanks, @FlohGro and @agiletortoise. Both worked as you indicated. I do prefer the second one but why does it tab the first line of the output and render it in a different font? It didn’t do that for the first solution :thinking:


If you copy-pasted, you might need to clean up the template and remove the tab. The string between the backticks will be treated literally, so if a tab is in there, it will be included in the result. In the script editor, you would want it to look like the below:
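
That is, the lines between the backticks should start flush at the left margin, with no tab in front of them:

answer = `${selection}

${answer}`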

As for why it’s a different font, that’s Markdown. Indented text is treated as a code block in Markdown.

Thanks. Learning new things every day. Appreciate your patience.

As a long time user of Drafts, I have a general question that came up while using ChatGPT actions. I usually use a keyboard shortcut to execute the action. Since some actions take a while to complete, I sometimes miss a visual indication that lets me know that Drafts is (still) processing the action (i.e. in the case of ChatGPT, it’s a request waiting for a response). Is there a feature like this that I may have missed, or is it nonexistent? (I don’t even know if I pressed the hotkey correctly, or if I missed it and will wait forever. This has never been an issue for actions that take a split second to complete.)

Check out the last item on the change log for the latest beta that’s currently in testing :wink:


I have been using ChatGPT: Modify Selection and have several prompts that generally work to create a summary, correct spelling errors, do punctuation, and create a bullet list of key points. My problem is that it takes forever to type out the prompts every time that I use this action.

Is there some way that I can create prompts that will automatically be inserted in “ChatGPT: Modify Selection” every time that I use this action?

Take a copy of the action and give it a name like "AI Summary".

Next, look through the script step for this line.

p.addTextView("instruction", "Prompt", "")

The empty double quotes at the end of that line hold the default prompt. It is currently empty. Add your own text in, and that will become the default prompt when you use that action.
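
For example (the wording here is just an illustration; use whatever instruction suits the action):

p.addTextView("instruction", "Prompt", "Create a summary")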

This means you can still tweak your prompt on the fly.

If you never want to do that, then you would need to look at removing the Drafts pop-up prompt code and just passing your GPT prompt text straight in.
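
A rough sketch of that no-prompt variant, reusing the same structure as the script above with the instruction hard-coded (adjust the wording to whatever you want the action to do):

// get your input selected text, and store range for later
const selection = editor.getSelectedText()
const [st, len] = editor.getSelectedRange()

// hard-coded instruction instead of a Drafts Prompt
const instruction = "Create a summary"

let f = () => {
	// build prompt with instruction and selection
	let chatPrompt = `${instruction} of the text, returning only the result: ${selection}`
	// send that to ChatGPT
	let ai = new OpenAI()
	let answer = ai.quickChatResponse(chatPrompt)

	// if we got no reply, cancel
	if (!answer || answer.length == 0) {
		return false
	}

	editor.setSelectedText(answer)
	editor.setSelectedRange(st, answer.length)
	return true
}

if (!f()) {
	context.fail()
}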

Thank you!! Your suggestion made all the difference!