
Ollama create fails when using a utf16 Modelfile #4503

Closed
dehlong opened this issue May 17, 2024 · 19 comments · Fixed by #4533
Labels
bug Something isn't working

Comments

dehlong commented May 17, 2024

What is the issue?

Hello,
I'm trying to create a new model, and no matter what the Modelfile contains, 90% of the time I get:
Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"

Is there any solution to this?
This is my Modelfile:
FROM llama3
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from super mario bros, acting as an assistant.

OS: Linux
GPU: Other
CPU: Intel
Ollama version: 0.1.38

@dehlong dehlong added the bug Something isn't working label May 17, 2024
@brt-yilmaz

Be careful about the spaces. First, try copying the Modelfile string from https://github.com/ollama/ollama/blob/main/docs/api.md#request-13 and then customise it.


pdevine commented May 18, 2024

@dehlong I just created a model using that modelfile and it worked fine. Was there anything else you had added into it?


pdevine commented May 18, 2024

% ollama create -f ~/Modelfiles/Mariomodelfile pdevine/mario
transferring model data
using existing layer sha256:00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29
using existing layer sha256:4fa551d4f938f68b8c1e6afa9d28befb70e3f33f75d0753248d530364aeea40f
using existing layer sha256:8ab4849b038cf0abc5b1c9b8ee1443dca6b93a045c2272180d985126eb40bf6f
creating new layer sha256:278f3e552ef89955f0e5b42c48d52a37794179dc28d1caff2d5b8e8ff133e158
creating new layer sha256:40440ec37ef2b2862d182b7926987668264d13ff9c97407acf36a44106997f8f
creating new layer sha256:bd886c34b18cdaf5ed2d30acb3de0d3010c5546c6b1d259d5e75b2efe4a85c70
writing manifest
success
% ollama run pdevine/mario
>>> hi there
"It's-a me, Mario! Welcome to our little corner of the Mushroom Kingdom! I'm-a here to help you with anything you might need. Whether it's rescuing Princess Peach
from Bowser or finding the hidden Warp Pipes, I'm your guy! So, what seems to be the problem? Need some power-ups or maybe a map to the nearest castle?"

The Modelfile looks like:

FROM llama3
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from super mario bros, acting as an assistant.


dehlong commented May 19, 2024

Nope, it was just this.


duck1y commented May 19, 2024

I've been having the same issue on Windows for a few days.

At this point I'm sure it's not an issue with the Modelfile itself. If I use ollama show --modelfile to copy the Modelfile from an existing model and then run ollama create test58 -f ./Modelfile, I get the same Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message" that dehlong does.

I've uninstalled and reinstalled ollama. Using OpenWebUI model builder does work, but I need to use ollama create for reasons.

@dehlong I'll let you know if I trip into an answer but at this point I feel pretty stuck.

For clarity this has been working for me previously. I've used ollama create successfully dozens of times. It broke for me with the last update.


pdevine commented May 19, 2024

I'm wondering if this is just a problem w/ msdos files adding a carriage return+line feed to the end of each line in the file. Can either of you add the file to the issue? I think you should be able to drag+drop it into a comment.


duck1y commented May 19, 2024

Modelfile.zip

Does this work?
I realize now the file type is showing up in Windows Explorer as a Program PhotonWorkShop file (Anycubic slicer). Unclear to me whether this is a problem, and unclear whether it's likely to be the same issue dehlong has, considering he's on Linux.

edit: TBC - having the same problem with other modelfiles that show up as type '3 file' and '6 file'

(screenshot attached)


pdevine commented May 19, 2024

@duck1y Perfect. I was able to reproduce the problem using that file. Will try to sort out a fix now.


pdevine commented May 19, 2024

The problem turns out to be that the file is UTF-16 and we're trying to parse it as UTF-8. The temporary workaround is to convert the file in PowerShell using the command:

powershell "Get-Content 'Modelfile' | Out-File 'Newmodelfile' -Encoding ascii"

I have a fix I'm working on, which hopefully we can get into 0.1.39.
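For anyone curious about what such a fix involves, here is a minimal sketch (my own illustration, not the code from the actual fix merged for this issue): sniff the file's byte order mark (BOM) and transcode UTF-16 to UTF-8 before handing the text to the Modelfile parser.

```go
package main

import (
	"bytes"
	"fmt"
	"unicode/utf16"
)

// decodeModelfile converts raw Modelfile bytes to a UTF-8 string,
// sniffing a leading byte order mark (BOM) to detect UTF-16.
func decodeModelfile(raw []byte) string {
	switch {
	case bytes.HasPrefix(raw, []byte{0xFF, 0xFE}): // UTF-16 little-endian BOM
		return decodeUTF16(raw[2:], false)
	case bytes.HasPrefix(raw, []byte{0xFE, 0xFF}): // UTF-16 big-endian BOM
		return decodeUTF16(raw[2:], true)
	case bytes.HasPrefix(raw, []byte{0xEF, 0xBB, 0xBF}): // UTF-8 BOM: strip it
		return string(raw[3:])
	default: // no BOM: assume plain UTF-8/ASCII
		return string(raw)
	}
}

// decodeUTF16 reassembles 16-bit code units from the byte stream and
// decodes them (surrogate pairs included) into a Go string.
func decodeUTF16(b []byte, bigEndian bool) string {
	units := make([]uint16, 0, len(b)/2)
	for i := 0; i+1 < len(b); i += 2 {
		if bigEndian {
			units = append(units, uint16(b[i])<<8|uint16(b[i+1]))
		} else {
			units = append(units, uint16(b[i+1])<<8|uint16(b[i]))
		}
	}
	return string(utf16.Decode(units))
}

func main() {
	// The bytes "FROM" as PowerShell's default UTF-16 LE output would
	// store them, BOM first.
	raw := []byte{0xFF, 0xFE, 'F', 0, 'R', 0, 'O', 0, 'M', 0}
	fmt.Println(decodeModelfile(raw)) // prints: FROM
}
```

A real implementation would likely lean on the golang.org/x/text/encoding/unicode package rather than decoding code units by hand, but the principle is the same.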


duck1y commented May 19, 2024

@pdevine the workaround worked for me - thank you so much!! I appreciate you!!!

Is it clear that this is likely to be the source of @dehlong's issue as well?
@dehlong, can you confirm this workaround works for you? I would hate to have hijacked your thread and left you without a solution.

Ty both :):)


pdevine commented May 19, 2024

@duck1y it's almost certainly the same problem.


Anorid commented May 20, 2024

@pdevine If you have time, could you also help take a look at this error?

@pdevine pdevine changed the title Creating an own model is not reliable Ollama create fails when using a utf16 Modelfile May 20, 2024

pdevine commented May 20, 2024

@Anorid can you create a new issue for that problem and post the Modelfile along with any relevant info (like if you're trying to use a converted model where you got the weights from)? This is definitely a different issue than the one you posted.


Anorid commented May 20, 2024

I've created a new issue and posted the relevant information
#4529


Patrickeik commented May 25, 2024

Sorry, I tried to do as instructed but still get an error:

get-content LLMteacher-modelfile | out-file LLMtest -Encoding ascii
PS C:\Users\p\Documents\testing> ollama create LLMTeacher -f LLMtest

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"

windows 11

What am I doing wrong?

@Patrickeik

Here are more details:
PS C:\Users\peike\ollama> ollama show llama3 --modelfile > test
PS C:\Users\peike\ollama> ollama create ll3teacher -f test

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
PS C:\Users\peike\ollama> Get-Content test | Out-File newtest -Encoding ascii
PS C:\Users\peike\ollama> ollama create ll3teacher -f newtest

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
PS C:\Users\peike\ollama>


donaldafeith commented May 30, 2024

For anyone having problems creating a model (on Windows):

Create a text file and put in it:
FROM ./themodelfileyouwant
Save the text file, then rename it Modelfile (no .txt extension).
Then go to cmd and run:
ollama create thenameyouwanttonameyourmodel -f Modelfile
Hit enter and wait for the magic.

Not sure why this isn't explained in a straightforward manner, but that's how I do it. Works great.


pdevine commented May 30, 2024

This should be fixed with 0.1.39, which can now parse a UTF-16 file (albeit only with 8-bit characters). There's another PR coming to allow UTF-16 characters in the Modelfile itself.

@Patrickeik

Thank you both! The workaround helped me! Thank you!
