
LLamaSharp doesn't seem to work with any version > 0.8.1 #8

@JohnMcMahonDev

Description

I've tried a number of LLamaSharp versions, but in every version higher than 0.8.1 I get the following generic error when loading models:

Unhandled exception. System.TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception.
 ---> LLama.Exceptions.RuntimeError: The native library cannot be correctly loaded. It could be one of the following reasons:
1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
3. One of the dependency of the native library is missed. Please use `ldd` on linux, `dumpbin` on windows and `otool`to check if all the dependency of the native library is satisfied. Generally you could find the libraries under your output folder.
4. Try to compile llama.cpp yourself to generate a libllama library, then use `LLama.Native.NativeLibraryConfig.WithLibrary` to specify it at the very beginning of your code. For more informations about compilation, please refer to LLamaSharp repo on github.

   at LLama.Native.NativeApi..cctor()
   --- End of inner exception stack trace ---
   at LLama.Native.NativeApi.llama_max_devices()
   at LLama.Abstractions.TensorSplitsCollection..ctor()
   at LLama.Common.ModelParams..ctor(String modelPath)
   at Program.<Main>$(String[] args) in D:\projects\testing\Llama\LLamaTesting\Program.cs:line 7
   at Program.<Main>(String[] args)

I've tried both the CPU and CUDA 12 backends for a number of versions, but each one gives the same error. Any ideas?
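For what it's worth, suggestion 4 in the error message can be sketched roughly as below. This is only an illustration: the library path and model path are placeholders, and the exact shape of the `NativeLibraryConfig` API has changed between LLamaSharp versions (in some releases the call hangs off `NativeLibraryConfig.Instance`), so check the version you actually have installed.

```csharp
using LLama.Common;
using LLama.Native;

// Sketch only: point LLamaSharp at a libllama you compiled yourself from llama.cpp.
// The path below is an assumption -- replace it with your own build output
// (libllama.so on Linux, llama.dll on Windows).
NativeLibraryConfig.Instance.WithLibrary(@"D:\projects\llama.cpp\build\bin\llama.dll");

// WithLibrary must run before any other LLamaSharp call: the
// TypeInitializationException above is thrown the first time NativeApi
// is touched, which here happens inside the ModelParams constructor.
var parameters = new ModelParams(@"D:\models\your-model.gguf");
```

If the native library still fails to load after this, checking its transitive dependencies with `ldd` (Linux) or `dumpbin /dependents` (Windows), as suggestion 3 describes, usually narrows down which runtime DLL/.so is missing.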
