url
stringlengths
51
54
repository_url
stringclasses
1 value
labels_url
stringlengths
65
68
comments_url
stringlengths
60
63
events_url
stringlengths
58
61
html_url
stringlengths
39
44
id
int64
1.78B
2.82B
node_id
stringlengths
18
19
number
int64
1
8.69k
title
stringlengths
1
382
user
dict
labels
listlengths
0
5
state
stringclasses
2 values
locked
bool
1 class
assignee
dict
assignees
listlengths
0
2
milestone
null
comments
int64
0
323
created_at
timestamp[s]
updated_at
timestamp[s]
closed_at
timestamp[s]
author_association
stringclasses
4 values
sub_issues_summary
dict
active_lock_reason
null
draft
bool
2 classes
pull_request
dict
body
stringlengths
2
118k
closed_by
dict
reactions
dict
timeline_url
stringlengths
60
63
performed_via_github_app
null
state_reason
stringclasses
4 values
is_pull_request
bool
2 classes
https://api.github.com/repos/ollama/ollama/issues/4362
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4362/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4362/comments
https://api.github.com/repos/ollama/ollama/issues/4362/events
https://github.com/ollama/ollama/pull/4362
2,290,928,451
PR_kwDOJ0Z1Ps5vKS8y
4,362
fix `ollama create`'s usage string
{ "login": "todashuta", "id": 1555633, "node_id": "MDQ6VXNlcjE1NTU2MzM=", "avatar_url": "https://avatars.githubusercontent.com/u/1555633?v=4", "gravatar_id": "", "url": "https://api.github.com/users/todashuta", "html_url": "https://github.com/todashuta", "followers_url": "https://api.github.com/users/todashuta/followers", "following_url": "https://api.github.com/users/todashuta/following{/other_user}", "gists_url": "https://api.github.com/users/todashuta/gists{/gist_id}", "starred_url": "https://api.github.com/users/todashuta/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/todashuta/subscriptions", "organizations_url": "https://api.github.com/users/todashuta/orgs", "repos_url": "https://api.github.com/users/todashuta/repos", "events_url": "https://api.github.com/users/todashuta/events{/privacy}", "received_events_url": "https://api.github.com/users/todashuta/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-05-11T14:04:53
2024-05-12T02:39:20
2024-05-11T21:47:49
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4362", "html_url": "https://github.com/ollama/ollama/pull/4362", "diff_url": "https://github.com/ollama/ollama/pull/4362.diff", "patch_url": "https://github.com/ollama/ollama/pull/4362.patch", "merged_at": "2024-05-11T21:47:49" }
Since `StringP()` automatically adds the initial value, the initial value description for Modelfile was duplicated. I have fixed this by removing the redundant default value description from the usage. Before: ``` $ ./ollama create -h Create a model from a Modelfile Usage: ollama create MODEL [flags] Flags: -f, --file string Name of the Modelfile (default "Modelfile") (default "Modelfile") -h, --help help for create -q, --quantize string Quantize model to this level (e.g. q4_0) Environment Variables: OLLAMA_HOST The host:port or base URL of the Ollama server (e.g. http://localhost:11434) ``` After: ``` $ ./ollama create -h Create a model from a Modelfile Usage: ollama create MODEL [flags] Flags: -f, --file string Name of the Modelfile (default "Modelfile") -h, --help help for create -q, --quantize string Quantize model to this level (e.g. q4_0) Environment Variables: OLLAMA_HOST The host:port or base URL of the Ollama server (e.g. http://localhost:11434) ```
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4362/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4362/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8670
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8670/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8670/comments
https://api.github.com/repos/ollama/ollama/issues/8670/events
https://github.com/ollama/ollama/issues/8670
2,819,110,416
I_kwDOJ0Z1Ps6oCDYQ
8,670
Ollama official website API for fetching the models and its information
{ "login": "ALAWIII", "id": 60029291, "node_id": "MDQ6VXNlcjYwMDI5Mjkx", "avatar_url": "https://avatars.githubusercontent.com/u/60029291?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ALAWIII", "html_url": "https://github.com/ALAWIII", "followers_url": "https://api.github.com/users/ALAWIII/followers", "following_url": "https://api.github.com/users/ALAWIII/following{/other_user}", "gists_url": "https://api.github.com/users/ALAWIII/gists{/gist_id}", "starred_url": "https://api.github.com/users/ALAWIII/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ALAWIII/subscriptions", "organizations_url": "https://api.github.com/users/ALAWIII/orgs", "repos_url": "https://api.github.com/users/ALAWIII/repos", "events_url": "https://api.github.com/users/ALAWIII/events{/privacy}", "received_events_url": "https://api.github.com/users/ALAWIII/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 6573197867, "node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw", "url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com", "name": "ollama.com", "color": "ffffff", "default": false, "description": "" } ]
open
false
null
[]
null
0
2025-01-29T19:40:32
2025-01-29T22:31:42
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
We need to build an API for the ollama website where all the models and their descriptions details are stored so that we can automatically fetch those data and embed them in our various apps!
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8670/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8670/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2258
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2258/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2258/comments
https://api.github.com/repos/ollama/ollama/issues/2258/events
https://github.com/ollama/ollama/pull/2258
2,106,167,737
PR_kwDOJ0Z1Ps5lXLjp
2,258
docs: keep_alive
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
3
2024-01-29T18:31:25
2024-02-06T16:00:06
2024-02-06T16:00:05
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2258", "html_url": "https://github.com/ollama/ollama/pull/2258", "diff_url": "https://github.com/ollama/ollama/pull/2258.diff", "patch_url": "https://github.com/ollama/ollama/pull/2258.patch", "merged_at": "2024-02-06T16:00:05" }
Document the `keep_alive` parameter which keeps the model loaded into memory
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2258/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2258/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1170
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1170/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1170/comments
https://api.github.com/repos/ollama/ollama/issues/1170/events
https://github.com/ollama/ollama/issues/1170
1,998,558,099
I_kwDOJ0Z1Ps53H5OT
1,170
Allow LLMs to Query a Database Directly
{ "login": "FaizelK", "id": 24388421, "node_id": "MDQ6VXNlcjI0Mzg4NDIx", "avatar_url": "https://avatars.githubusercontent.com/u/24388421?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FaizelK", "html_url": "https://github.com/FaizelK", "followers_url": "https://api.github.com/users/FaizelK/followers", "following_url": "https://api.github.com/users/FaizelK/following{/other_user}", "gists_url": "https://api.github.com/users/FaizelK/gists{/gist_id}", "starred_url": "https://api.github.com/users/FaizelK/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FaizelK/subscriptions", "organizations_url": "https://api.github.com/users/FaizelK/orgs", "repos_url": "https://api.github.com/users/FaizelK/repos", "events_url": "https://api.github.com/users/FaizelK/events{/privacy}", "received_events_url": "https://api.github.com/users/FaizelK/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
19
2023-11-17T08:41:58
2024-09-01T22:48:23
2024-05-09T23:03:10
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I have installed ollama and can run prompts, example: `ollama run llama2 "why is the sky blue?"` Is there any way to connect to MYSQL database and start asking about database data, example: `###### database file###### database.cnf host="localhost" user="admin" passsword="admin" database="mDatabase" ollama run llama2 database "how many users are administrators?" I have searched repository and I cannot see anything like this, I also went to langchain docs but did not see an example with ollama `
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1170/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1170/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6940
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6940/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6940/comments
https://api.github.com/repos/ollama/ollama/issues/6940/events
https://github.com/ollama/ollama/pull/6940
2,546,315,995
PR_kwDOJ0Z1Ps58kXZO
6,940
CI: Fix win arm version defect
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-09-24T20:21:17
2024-09-24T22:18:13
2024-09-24T22:18:11
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6940", "html_url": "https://github.com/ollama/ollama/pull/6940", "diff_url": "https://github.com/ollama/ollama/pull/6940.diff", "patch_url": "https://github.com/ollama/ollama/pull/6940.patch", "merged_at": "2024-09-24T22:18:10" }
Build 0.3.12-rc5 reports a pre-release string on win-arm due to the version not being set properly in CI. write-host in powershell writes directly to the console and will not be picked up by a pipe. Echo, or write-output will.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6940/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6940/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1833
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1833/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1833/comments
https://api.github.com/repos/ollama/ollama/issues/1833/events
https://github.com/ollama/ollama/pull/1833
2,068,998,839
PR_kwDOJ0Z1Ps5jZjC8
1,833
Dont use `-Wall` in static build
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-01-07T05:47:53
2024-01-07T15:39:20
2024-01-07T15:39:19
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1833", "html_url": "https://github.com/ollama/ollama/pull/1833", "diff_url": "https://github.com/ollama/ollama/pull/1833.diff", "patch_url": "https://github.com/ollama/ollama/pull/1833.patch", "merged_at": "2024-01-07T15:39:19" }
Fixes this warning: ``` % go build . # github.com/jmorganca/ollama/llm cgo-gcc-prolog:153:33: warning: unused variable '_cgo_a' [-Wunused-variable] cgo-gcc-prolog:165:33: warning: unused variable '_cgo_a' [-Wunused-variable] ```
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1833/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1833/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/104
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/104/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/104/comments
https://api.github.com/repos/ollama/ollama/issues/104/events
https://github.com/ollama/ollama/pull/104
1,810,752,804
PR_kwDOJ0Z1Ps5V1CBF
104
use readline
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-07-18T21:32:24
2023-07-19T20:36:28
2023-07-19T20:36:24
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/104", "html_url": "https://github.com/ollama/ollama/pull/104", "diff_url": "https://github.com/ollama/ollama/pull/104.diff", "patch_url": "https://github.com/ollama/ollama/pull/104.patch", "merged_at": "2023-07-19T20:36:24" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/104/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/104/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2341
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2341/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2341/comments
https://api.github.com/repos/ollama/ollama/issues/2341/events
https://github.com/ollama/ollama/pull/2341
2,116,779,679
PR_kwDOJ0Z1Ps5l7ja6
2,341
Revamp the windows tray code
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-02-04T01:00:37
2024-02-04T18:45:06
2024-02-04T18:45:02
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2341", "html_url": "https://github.com/ollama/ollama/pull/2341", "diff_url": "https://github.com/ollama/ollama/pull/2341.diff", "patch_url": "https://github.com/ollama/ollama/pull/2341.patch", "merged_at": "2024-02-04T18:45:02" }
To get more control over our windows app this pulls the win32 logic into our Go code instead of using an upstream library. Still gobs of debug logging that I'll clean up soon, but it's now functional. The upgrade flow doesn't work yet of course.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2341/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2341/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6915
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6915/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6915/comments
https://api.github.com/repos/ollama/ollama/issues/6915/events
https://github.com/ollama/ollama/issues/6915
2,542,299,940
I_kwDOJ0Z1Ps6XiGsk
6,915
qwen2.5 can't stop answering
{ "login": "xutiange", "id": 16460665, "node_id": "MDQ6VXNlcjE2NDYwNjY1", "avatar_url": "https://avatars.githubusercontent.com/u/16460665?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xutiange", "html_url": "https://github.com/xutiange", "followers_url": "https://api.github.com/users/xutiange/followers", "following_url": "https://api.github.com/users/xutiange/following{/other_user}", "gists_url": "https://api.github.com/users/xutiange/gists{/gist_id}", "starred_url": "https://api.github.com/users/xutiange/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xutiange/subscriptions", "organizations_url": "https://api.github.com/users/xutiange/orgs", "repos_url": "https://api.github.com/users/xutiange/repos", "events_url": "https://api.github.com/users/xutiange/events{/privacy}", "received_events_url": "https://api.github.com/users/xutiange/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
9
2024-09-23T11:04:36
2024-10-16T08:21:49
2024-10-04T08:19:51
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When I use qwen2.5, the model sometimes keeps responding and cannot be stopped. ### OS Linux, Docker ### GPU Nvidia ### CPU _No response_ ### Ollama version 0.3.11
{ "login": "xutiange", "id": 16460665, "node_id": "MDQ6VXNlcjE2NDYwNjY1", "avatar_url": "https://avatars.githubusercontent.com/u/16460665?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xutiange", "html_url": "https://github.com/xutiange", "followers_url": "https://api.github.com/users/xutiange/followers", "following_url": "https://api.github.com/users/xutiange/following{/other_user}", "gists_url": "https://api.github.com/users/xutiange/gists{/gist_id}", "starred_url": "https://api.github.com/users/xutiange/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xutiange/subscriptions", "organizations_url": "https://api.github.com/users/xutiange/orgs", "repos_url": "https://api.github.com/users/xutiange/repos", "events_url": "https://api.github.com/users/xutiange/events{/privacy}", "received_events_url": "https://api.github.com/users/xutiange/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6915/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6915/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5394
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5394/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5394/comments
https://api.github.com/repos/ollama/ollama/issues/5394/events
https://github.com/ollama/ollama/issues/5394
2,382,430,437
I_kwDOJ0Z1Ps6OAQDl
5,394
Ollama loads gemma2 27b with --ctx-size 16384
{ "login": "chigkim", "id": 22120994, "node_id": "MDQ6VXNlcjIyMTIwOTk0", "avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chigkim", "html_url": "https://github.com/chigkim", "followers_url": "https://api.github.com/users/chigkim/followers", "following_url": "https://api.github.com/users/chigkim/following{/other_user}", "gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}", "starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chigkim/subscriptions", "organizations_url": "https://api.github.com/users/chigkim/orgs", "repos_url": "https://api.github.com/users/chigkim/repos", "events_url": "https://api.github.com/users/chigkim/events{/privacy}", "received_events_url": "https://api.github.com/users/chigkim/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-06-30T20:28:35
2024-08-16T00:38:18
2024-08-16T00:38:17
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Isn't gemma2 context size 8192? When I access GEMMA2:27b-instruct-q8_0 with OpenAI api, it loads the model with `--ctx-size 16384` according to log. ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.1.48
{ "login": "chigkim", "id": 22120994, "node_id": "MDQ6VXNlcjIyMTIwOTk0", "avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chigkim", "html_url": "https://github.com/chigkim", "followers_url": "https://api.github.com/users/chigkim/followers", "following_url": "https://api.github.com/users/chigkim/following{/other_user}", "gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}", "starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chigkim/subscriptions", "organizations_url": "https://api.github.com/users/chigkim/orgs", "repos_url": "https://api.github.com/users/chigkim/repos", "events_url": "https://api.github.com/users/chigkim/events{/privacy}", "received_events_url": "https://api.github.com/users/chigkim/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5394/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5394/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7389
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7389/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7389/comments
https://api.github.com/repos/ollama/ollama/issues/7389/events
https://github.com/ollama/ollama/pull/7389
2,616,963,532
PR_kwDOJ0Z1Ps6ABt-B
7,389
chore: update llama.h
{ "login": "eltociear", "id": 22633385, "node_id": "MDQ6VXNlcjIyNjMzMzg1", "avatar_url": "https://avatars.githubusercontent.com/u/22633385?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eltociear", "html_url": "https://github.com/eltociear", "followers_url": "https://api.github.com/users/eltociear/followers", "following_url": "https://api.github.com/users/eltociear/following{/other_user}", "gists_url": "https://api.github.com/users/eltociear/gists{/gist_id}", "starred_url": "https://api.github.com/users/eltociear/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eltociear/subscriptions", "organizations_url": "https://api.github.com/users/eltociear/orgs", "repos_url": "https://api.github.com/users/eltociear/repos", "events_url": "https://api.github.com/users/eltociear/events{/privacy}", "received_events_url": "https://api.github.com/users/eltociear/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-10-27T23:49:23
2024-10-28T23:17:52
2024-10-28T23:17:51
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7389", "html_url": "https://github.com/ollama/ollama/pull/7389", "diff_url": "https://github.com/ollama/ollama/pull/7389.diff", "patch_url": "https://github.com/ollama/ollama/pull/7389.patch", "merged_at": null }
indicies -> indices
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7389/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7389/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7371
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7371/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7371/comments
https://api.github.com/repos/ollama/ollama/issues/7371/events
https://github.com/ollama/ollama/issues/7371
2,615,459,449
I_kwDOJ0Z1Ps6b5L55
7,371
加载qwen2.5-1.5b-instruct-fp16.gguf运行非常慢
{ "login": "czhcc", "id": 4754730, "node_id": "MDQ6VXNlcjQ3NTQ3MzA=", "avatar_url": "https://avatars.githubusercontent.com/u/4754730?v=4", "gravatar_id": "", "url": "https://api.github.com/users/czhcc", "html_url": "https://github.com/czhcc", "followers_url": "https://api.github.com/users/czhcc/followers", "following_url": "https://api.github.com/users/czhcc/following{/other_user}", "gists_url": "https://api.github.com/users/czhcc/gists{/gist_id}", "starred_url": "https://api.github.com/users/czhcc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/czhcc/subscriptions", "organizations_url": "https://api.github.com/users/czhcc/orgs", "repos_url": "https://api.github.com/users/czhcc/repos", "events_url": "https://api.github.com/users/czhcc/events{/privacy}", "received_events_url": "https://api.github.com/users/czhcc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-10-26T02:14:25
2024-10-30T05:38:15
2024-10-30T05:38:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? 但如果从仓库下载的qwen2.5-1.5b量化版还是很快的 ### OS Linux, Docker ### GPU Nvidia ### CPU Intel ### Ollama version 0.3.10
{ "login": "czhcc", "id": 4754730, "node_id": "MDQ6VXNlcjQ3NTQ3MzA=", "avatar_url": "https://avatars.githubusercontent.com/u/4754730?v=4", "gravatar_id": "", "url": "https://api.github.com/users/czhcc", "html_url": "https://github.com/czhcc", "followers_url": "https://api.github.com/users/czhcc/followers", "following_url": "https://api.github.com/users/czhcc/following{/other_user}", "gists_url": "https://api.github.com/users/czhcc/gists{/gist_id}", "starred_url": "https://api.github.com/users/czhcc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/czhcc/subscriptions", "organizations_url": "https://api.github.com/users/czhcc/orgs", "repos_url": "https://api.github.com/users/czhcc/repos", "events_url": "https://api.github.com/users/czhcc/events{/privacy}", "received_events_url": "https://api.github.com/users/czhcc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7371/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7371/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2783
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2783/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2783/comments
https://api.github.com/repos/ollama/ollama/issues/2783/events
https://github.com/ollama/ollama/issues/2783
2,156,801,774
I_kwDOJ0Z1Ps6Aji7u
2,783
Connection Error with OllamaFunctions in Langchain
{ "login": "quartermaine", "id": 24212117, "node_id": "MDQ6VXNlcjI0MjEyMTE3", "avatar_url": "https://avatars.githubusercontent.com/u/24212117?v=4", "gravatar_id": "", "url": "https://api.github.com/users/quartermaine", "html_url": "https://github.com/quartermaine", "followers_url": "https://api.github.com/users/quartermaine/followers", "following_url": "https://api.github.com/users/quartermaine/following{/other_user}", "gists_url": "https://api.github.com/users/quartermaine/gists{/gist_id}", "starred_url": "https://api.github.com/users/quartermaine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/quartermaine/subscriptions", "organizations_url": "https://api.github.com/users/quartermaine/orgs", "repos_url": "https://api.github.com/users/quartermaine/repos", "events_url": "https://api.github.com/users/quartermaine/events{/privacy}", "received_events_url": "https://api.github.com/users/quartermaine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" }, { "id": 6677367769, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q", "url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info", "name": "needs more info", "color": "BA8041", "default": false, "description": "More information is needed to assist" } ]
closed
false
null
[]
null
6
2024-02-27T14:54:24
2025-01-12T00:52:08
2025-01-12T00:52:08
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### Description I am attempting to replicate the [Langchain tutorial](https://python.langchain.com/docs/integrations/chat/ollama_functions) in order to use OllamaFunctions for web extraction, as also demonstrated [here](https://python.langchain.com/docs/use_cases/web_scraping#scraping-with-extraction) in a Google Colab environment. ### Code ```python [1] %%capture !pip install langchain_experimental [2] from langchain_experimental.llms.ollama_functions import OllamaFunctions lm = OllamaFunctions(model="llama2:13b", base_url="http://localhost:11434", temperature=0) [3] %%capture !pip install -q langchain-openai langchain playwright beautifulsoup4 !playwright install [4] import nest_asyncio nest_asyncio.apply() [5] from langchain.chains import create_extraction_chain schema = { "properties": { "news_article_title": {"type": "string"}, "news_article_summary": {"type": "string"}, }, "required": ["news_article_title", "news_article_summary"], } def extract(content: str, schema: dict): return create_extraction_chain(schema=schema, llm=llm, verbose=True).invoke(content) [6] import pprint from langchain.text_splitter import RecursiveCharacterTextSplitter from langchain_community.document_loaders import AsyncChromiumLoader from langchain_community.document_transformers import BeautifulSoupTransformer def scrape_with_playwright(urls, schema): loader = AsyncChromiumLoader(urls) docs = loader.load() bs_transformer = BeautifulSoupTransformer() docs_transformed = bs_transformer.transform_documents( docs, tags_to_extract=["span"] ) print("Extracting content with LLM") # Grab the first 1000 tokens of the site splitter = RecursiveCharacterTextSplitter.from_tiktoken_encoder( chunk_size=1000, chunk_overlap=0, separators=["\n"] ) splits = splitter.split_documents(docs_transformed) print("Number of splits:", len(splits)) # Add this debugging statement if splits: # Check if splits list is not empty # Process the first split extracted_content = extract(schema=schema, content=splits[0].page_content) # Line where error occurs pprint.pprint(extracted_content) return extracted_content else: print("No splits found") # Add this debugging statement return None``` [7] urls = ["https://www.nytimes.com/"] extracted_content = scrape_with_playwright(urls, schema=schema) python ``` ### Error But I am getting the following error: ```python ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7b19911300>: Failed to establish a new connection: [Errno 111] Connection refused')) ```
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2783/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2783/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1979
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1979/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1979/comments
https://api.github.com/repos/ollama/ollama/issues/1979/events
https://github.com/ollama/ollama/issues/1979
2,080,466,913
I_kwDOJ0Z1Ps58AWfh
1,979
Unable to get Ollama to utilize GPU on Jetson Orin Nano 8Gb
{ "login": "remy415", "id": 105550370, "node_id": "U_kgDOBkqSIg", "avatar_url": "https://avatars.githubusercontent.com/u/105550370?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remy415", "html_url": "https://github.com/remy415", "followers_url": "https://api.github.com/users/remy415/followers", "following_url": "https://api.github.com/users/remy415/following{/other_user}", "gists_url": "https://api.github.com/users/remy415/gists{/gist_id}", "starred_url": "https://api.github.com/users/remy415/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remy415/subscriptions", "organizations_url": "https://api.github.com/users/remy415/orgs", "repos_url": "https://api.github.com/users/remy415/repos", "events_url": "https://api.github.com/users/remy415/events{/privacy}", "received_events_url": "https://api.github.com/users/remy415/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
81
2024-01-13T20:37:34
2024-04-20T02:47:34
2024-03-25T19:51:02
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I've reviewed the great tutorial made by @bnodnarb here: https://github.com/jmorganca/ollama/blob/main/docs/tutorials/nvidia-jetson.md The Orin Nano is running Ubuntu 20.04 with Jetpack 5.1.2 (r35.4.1 L4T). The container is also running L4T version 35.4.1. Jetpack 5.1.2 comes with CUDA 11.4 installed with compatibility support for CUDA 11.8. I also followed along with the other 3 Jetson-related issues and have not found a fix. I have also: Run ollama serve - with and without tmux - with and without tmux and LD_LIBRARY_PATH='/usr/local/cuda/lib64' - Using dustynv/stable-diffusion-webui:r35.4.1 container, installed ollama and ensured env variables set - Note: This container is able to provide accelerated processing of stable-diffusion-webui as-is In each of the situations, I used the 'mistral-jetson' generated model. For each of them, I get a similar output: ```2024/01/13 20:14:02 images.go:808: total blobs: 7 2024/01/13 20:14:02 images.go:815: total unused blobs removed: 0 2024/01/13 20:14:02 routes.go:930: Listening on 127.0.0.1:11434 (version 0.1.20) 2024/01/13 20:14:03 shim_ext_server.go:142: Dynamic LLM variants [cuda] 2024/01/13 20:14:03 gpu.go:88: Detecting GPU type 2024/01/13 20:14:03 gpu.go:203: Searching for GPU management library libnvidia-ml.so 2024/01/13 20:14:03 gpu.go:248: Discovered GPU libraries: [] 2024/01/13 20:14:03 gpu.go:203: Searching for GPU management library librocm_smi64.so 2024/01/13 20:14:03 gpu.go:248: Discovered GPU libraries: [] 2024/01/13 20:14:03 routes.go:953: no GPU detected [GIN] 2024/01/13 - 20:14:28 | 200 | 73.666µs | 127.0.0.1 | HEAD "/" [GIN] 2024/01/13 - 20:14:28 | 200 | 1.154281ms | 127.0.0.1 | POST "/api/show" [GIN] 2024/01/13 - 20:14:28 | 200 | 644.279µs | 127.0.0.1 | POST "/api/show" 2024/01/13 20:14:28 llm.go:71: GPU not available, falling back to CPU 2024/01/13 20:14:28 ext_server_common.go:136: Initializing internal llama server (... llama_model_loading) llm_load_vocab: special tokens definition check successful ( 259/32000 ). llm_load_print_meta: format = GGUF V3 (latest) llm_load_print_meta: arch = llama llm_load_print_meta: vocab type = SPM llm_load_print_meta: n_vocab = 32000 llm_load_print_meta: n_merges = 0 llm_load_print_meta: n_ctx_train = 32768 llm_load_print_meta: n_embd = 4096 llm_load_print_meta: n_head = 32 llm_load_print_meta: n_head_kv = 8 llm_load_print_meta: n_layer = 32 llm_load_print_meta: n_rot = 128 llm_load_print_meta: n_gqa = 4 llm_load_print_meta: f_norm_eps = 0.0e+00 llm_load_print_meta: f_norm_rms_eps = 1.0e-05 llm_load_print_meta: f_clamp_kqv = 0.0e+00 llm_load_print_meta: f_max_alibi_bias = 0.0e+00 llm_load_print_meta: n_ff = 14336 llm_load_print_meta: n_expert = 0 llm_load_print_meta: n_expert_used = 0 llm_load_print_meta: rope scaling = linear llm_load_print_meta: freq_base_train = 1000000.0 llm_load_print_meta: freq_scale_train = 1 llm_load_print_meta: n_yarn_orig_ctx = 32768 llm_load_print_meta: rope_finetuned = unknown llm_load_print_meta: model type = 7B llm_load_print_meta: model ftype = Q4_0 llm_load_print_meta: model params = 7.24 B llm_load_print_meta: model size = 3.83 GiB (4.54 BPW) llm_load_print_meta: general.name = mistralai llm_load_print_meta: BOS token = 1 '<s>' llm_load_print_meta: EOS token = 2 '</s>' llm_load_print_meta: UNK token = 0 '<unk>' llm_load_print_meta: LF token = 13 '<0x0A>' llm_load_tensors: ggml ctx size = 0.11 MiB llm_load_tensors: mem required = 3917.98 MiB ................................................................................................... llama_new_context_with_model: n_ctx = 2048 llama_new_context_with_model: freq_base = 1000000.0 llama_new_context_with_model: freq_scale = 1 llama_new_context_with_model: KV self size = 256.00 MiB, K (f16): 128.00 MiB, V (f16): 128.00 MiB llama_build_graph: non-view tensors processed: 676/676 llama_new_context_with_model: compute buffer total size = 159.19 MiB 2024/01/13 20:14:31 ext_server_common.go:144: Starting internal llama main loop [GIN] 2024/01/13 - 20:14:31 | 200 | 3.017526003s | 127.0.0.1 | POST "/api/generate" 2024/01/13 20:14:48 ext_server_common.go:158: loaded 0 images [GIN] 2024/01/13 - 20:15:04 | 200 | 16.039682856s | 127.0.0.1 | POST "/api/generate" ``` Key outputs are: `2024/01/13 20:14:03 routes.go:953: no GPU detected` `llm_load_tensors: mem required = 3917.98 MiB` Again, would just like to note that the stable-diffusion-webui application works with GPU, as well as the referenced docker container from dustynv. Any suggestions of things to check? Update: I forgot to mention that I verified CPU and GPU activity using jtop in another terminal. Edited for formatting. Edited to add OS & Jetson versions. Edited to add CUDA version.
{ "login": "remy415", "id": 105550370, "node_id": "U_kgDOBkqSIg", "avatar_url": "https://avatars.githubusercontent.com/u/105550370?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remy415", "html_url": "https://github.com/remy415", "followers_url": "https://api.github.com/users/remy415/followers", "following_url": "https://api.github.com/users/remy415/following{/other_user}", "gists_url": "https://api.github.com/users/remy415/gists{/gist_id}", "starred_url": "https://api.github.com/users/remy415/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remy415/subscriptions", "organizations_url": "https://api.github.com/users/remy415/orgs", "repos_url": "https://api.github.com/users/remy415/repos", "events_url": "https://api.github.com/users/remy415/events{/privacy}", "received_events_url": "https://api.github.com/users/remy415/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1979/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1979/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/688
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/688/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/688/comments
https://api.github.com/repos/ollama/ollama/issues/688/events
https://github.com/ollama/ollama/issues/688
1,923,366,960
I_kwDOJ0Z1Ps5ypEAw
688
Unable to create account with a secure password
{ "login": "FairyTail2000", "id": 22645621, "node_id": "MDQ6VXNlcjIyNjQ1NjIx", "avatar_url": "https://avatars.githubusercontent.com/u/22645621?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FairyTail2000", "html_url": "https://github.com/FairyTail2000", "followers_url": "https://api.github.com/users/FairyTail2000/followers", "following_url": "https://api.github.com/users/FairyTail2000/following{/other_user}", "gists_url": "https://api.github.com/users/FairyTail2000/gists{/gist_id}", "starred_url": "https://api.github.com/users/FairyTail2000/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FairyTail2000/subscriptions", "organizations_url": "https://api.github.com/users/FairyTail2000/orgs", "repos_url": "https://api.github.com/users/FairyTail2000/repos", "events_url": "https://api.github.com/users/FairyTail2000/events{/privacy}", "received_events_url": "https://api.github.com/users/FairyTail2000/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
6
2023-10-03T05:55:38
2023-10-12T00:35:30
2023-10-12T00:35:29
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I was just about to create an account with the following passwords: »,àî´æ=`"((Ý#±«"ü×%'yWðÍ&îPqØTX;¯þ;¿×æX˵¾ÖÛDþí,Á_+*ĬÊ<µ¾¡f'»êÎÖp¢e_P°óZk@XñÊ7ÒÊÖ©mðÂÝs5jÛCCýZ-C¹ÎÖúÃ'ô½¡7§îW(ÂcT_*Jo©h9>9Ãèh[Í pw_E}]kz#uEnn`Lr@[FF{jfS+~M*rd/52iWxja%jobADqcWX\oaZ[;=bPM].5Kc(gJH-_+okZbeQ'wQ_nVVQV-C{r3/7}+%#:{,->y.K,'A-M/fR9gw%*H}!H3=LY!{u Shortening the password to 50 chars works. However I would still like to use the max number of chars available in my password manager
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/688/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/688/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3054
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3054/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3054/comments
https://api.github.com/repos/ollama/ollama/issues/3054/events
https://github.com/ollama/ollama/issues/3054
2,179,435,900
I_kwDOJ0Z1Ps6B5418
3,054
Immense amount of disk reads when paging with `mmap`
{ "login": "hedleyroos", "id": 316314, "node_id": "MDQ6VXNlcjMxNjMxNA==", "avatar_url": "https://avatars.githubusercontent.com/u/316314?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hedleyroos", "html_url": "https://github.com/hedleyroos", "followers_url": "https://api.github.com/users/hedleyroos/followers", "following_url": "https://api.github.com/users/hedleyroos/following{/other_user}", "gists_url": "https://api.github.com/users/hedleyroos/gists{/gist_id}", "starred_url": "https://api.github.com/users/hedleyroos/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hedleyroos/subscriptions", "organizations_url": "https://api.github.com/users/hedleyroos/orgs", "repos_url": "https://api.github.com/users/hedleyroos/repos", "events_url": "https://api.github.com/users/hedleyroos/events{/privacy}", "received_events_url": "https://api.github.com/users/hedleyroos/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5755339642, "node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg", "url": "https://api.github.com/repos/ollama/ollama/labels/linux", "name": "linux", "color": "516E70", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
7
2024-03-11T15:29:17
2024-08-02T08:22:41
2024-07-24T23:10:07
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Querying the `/generate` API with the llama2 models works as expected. I append `keep_alive=0` to the query string to keep the model in RAM, and from `iotop` I can see it immediately loads the model in RAM (I am in CPU only mode). The loading also seems to take place in a single sub-process or thread - unsure which since I don't know the underlying design, but it looks like a sub-process. However, when switching to the llama2:70b model things change. It does the same single process loading of the model and takes longer because the model is naturally bigger, but thereafter 16 ollama processes are constantly reading from disk at about 140MB/s, and as a result generation is extremely slow. I have a 7950x, so the 16 processes make sense, but I do have 64GB RAM, and this should be enough for the model. `top` shows RAM usage at 34GB, but since there is memory mapping behind the scenes virt goes to about 46GB as expected. I cannot find a way to prevent ollama from constantly hammering the disk with the llama2:70b model. Unfortunately I don't know Golang, but could it be an issue with the memory mapping not being correctly shared between processes?
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3054/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3054/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/553
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/553/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/553/comments
https://api.github.com/repos/ollama/ollama/issues/553/events
https://github.com/ollama/ollama/pull/553
1,902,219,201
PR_kwDOJ0Z1Ps5ao06Y
553
add word wrapping for lines which are longer than the terminal width
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-19T05:15:36
2023-09-26T23:24:35
2023-09-22T20:36:08
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/553", "html_url": "https://github.com/ollama/ollama/pull/553", "diff_url": "https://github.com/ollama/ollama/pull/553.diff", "patch_url": "https://github.com/ollama/ollama/pull/553.patch", "merged_at": "2023-09-22T20:36:08" }
This change makes it so the REPL will properly wrap a line on a word boundary. The way it works is that it walks through each character of each token returned by the server, and then keeps a buffer of the last word. If the maximum boundary length is exceeded, it will backtrack using ANSI escape codes to the length of the current word buffer, erase to the end of the line, give a line feed, and then add the word fragment to the new line. This requires that the terminal allow ANSI graphics which should be OK for any modern terminal. If you run this headless, it will default to not wrapping any lines. Right now I've set the width to 5 characters less than the terminal width, but we can potentially make this a setting in the future. This fixes issue #150
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/553/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/553/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/974
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/974/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/974/comments
https://api.github.com/repos/ollama/ollama/issues/974/events
https://github.com/ollama/ollama/pull/974
1,974,951,437
PR_kwDOJ0Z1Ps5eeH0F
974
remove modelfile context deprecated in v0.0.7
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-11-02T20:02:47
2023-11-03T00:52:57
2023-11-03T00:52:56
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/974", "html_url": "https://github.com/ollama/ollama/pull/974", "diff_url": "https://github.com/ollama/ollama/pull/974.diff", "patch_url": "https://github.com/ollama/ollama/pull/974.patch", "merged_at": "2023-11-03T00:52:56" }
This modelfile variable was deprecated in ollama v0.0.7, which was a very early stage of the project. It was also not documented anywhere, and no longer used in our library images. It should be ok to remove this now.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/974/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/974/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5817
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5817/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5817/comments
https://api.github.com/repos/ollama/ollama/issues/5817/events
https://github.com/ollama/ollama/pull/5817
2,421,077,019
PR_kwDOJ0Z1Ps51_y_P
5,817
llm: consider `head_dim` in llama arch
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-20T22:11:50
2024-07-21T01:48:14
2024-07-21T01:48:12
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5817", "html_url": "https://github.com/ollama/ollama/pull/5817", "diff_url": "https://github.com/ollama/ollama/pull/5817.diff", "patch_url": "https://github.com/ollama/ollama/pull/5817.patch", "merged_at": "2024-07-21T01:48:12" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5817/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5817/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7788
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7788/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7788/comments
https://api.github.com/repos/ollama/ollama/issues/7788/events
https://github.com/ollama/ollama/issues/7788
2,681,946,191
I_kwDOJ0Z1Ps6f20BP
7,788
Ollama 0.4.3 ignores HTTPS_PROXY
{ "login": "0xmeyer", "id": 125983009, "node_id": "U_kgDOB4JZIQ", "avatar_url": "https://avatars.githubusercontent.com/u/125983009?v=4", "gravatar_id": "", "url": "https://api.github.com/users/0xmeyer", "html_url": "https://github.com/0xmeyer", "followers_url": "https://api.github.com/users/0xmeyer/followers", "following_url": "https://api.github.com/users/0xmeyer/following{/other_user}", "gists_url": "https://api.github.com/users/0xmeyer/gists{/gist_id}", "starred_url": "https://api.github.com/users/0xmeyer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/0xmeyer/subscriptions", "organizations_url": "https://api.github.com/users/0xmeyer/orgs", "repos_url": "https://api.github.com/users/0xmeyer/repos", "events_url": "https://api.github.com/users/0xmeyer/events{/privacy}", "received_events_url": "https://api.github.com/users/0xmeyer/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-11-22T06:13:07
2024-11-25T23:08:35
2024-11-25T23:08:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Since Ollama `0.4.3` the environment variable `HTTPS_RPOXY` is ignored. My old deployment ran with Ollama `0.4.1` without any problems. Should be reproducible: ```bash $ export HTTPS_PROXY=http://<PROXY>:3128 $ ollama -v Warning: could not connect to a running Ollama instance Warning: client version is 0.4.3 $ ollama-0.4.1 -v Warning: could not connect to a running Ollama instance Warning: client version is 0.4.1 $ ollama serve &> /tmp/ollama-0.4.3.log & $ ollama pull llama3.2 pulling manifest ⠦ # kill ollama instance and use v0.4.1 kill %1 ollama-0.4.1 serve &> /tmp/ollama-0.4.1.log & $ ollama pull llama3.2 pulling manifest pulling dde5aa3fc5ff... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 2.0 GB pulling 966de95ca8a6... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 1.4 KB pulling fcc5a6bec9da... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 7.7 KB pulling a70ff7e570d9... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 6.0 KB pulling 56bb8bd477a5... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 96 B pulling 34bb5ab01051... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 561 B verifying sha256 digest writing manifest success ``` ### OS Linux ### GPU Nvidia ### CPU _No response_ ### Ollama version 0.4.3
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7788/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7788/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3221
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3221/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3221/comments
https://api.github.com/repos/ollama/ollama/issues/3221/events
https://github.com/ollama/ollama/issues/3221
2,191,832,413
I_kwDOJ0Z1Ps6CpLVd
3,221
How to catch errors using ollama compatibility with OpenAI API
{ "login": "ejgutierrez74", "id": 11474846, "node_id": "MDQ6VXNlcjExNDc0ODQ2", "avatar_url": "https://avatars.githubusercontent.com/u/11474846?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ejgutierrez74", "html_url": "https://github.com/ejgutierrez74", "followers_url": "https://api.github.com/users/ejgutierrez74/followers", "following_url": "https://api.github.com/users/ejgutierrez74/following{/other_user}", "gists_url": "https://api.github.com/users/ejgutierrez74/gists{/gist_id}", "starred_url": "https://api.github.com/users/ejgutierrez74/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ejgutierrez74/subscriptions", "organizations_url": "https://api.github.com/users/ejgutierrez74/orgs", "repos_url": "https://api.github.com/users/ejgutierrez74/repos", "events_url": "https://api.github.com/users/ejgutierrez74/events{/privacy}", "received_events_url": "https://api.github.com/users/ejgutierrez74/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-03-18T10:36:23
2024-03-19T17:28:38
2024-03-19T17:28:38
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, im trying this code: ```python def llama_openaiv2(prompt, add_inst=True, #By default True, if you use a base model should write it as False model="llama2", temperature=0.0, #By default in openai is 1.0 o 0.7 depends of the model, openai from 0.0 to 2.0, llama2 from 0.0 to 1.0 max_tokens=1024, verbose=False ): if add_inst: prompt = f"[INST]{prompt}[/INST]" if verbose: print(f"Prompt:\n{prompt}\n") print(f"model: {model}") error = 0 try: response = client.chat.completions.create( messages=[ { 'role': 'user', 'content': prompt, } ], model=model, max_tokens=max_tokens, temperature=temperature ) except openai.APIError as e: #Handle API error here, e.g. retry or log print(f"Llama2: OpenAI API returned an API Error:{e} ") error = 1 pass except openai.APIConnectionError as e: #Handle connection error here print(f"Llama2:Failed to connect to OpenAI API:{e} ") error = 2 pass except openai.RateLimitError as e: #Handle rate limit error (we recommend using exponential backoff) print(f"Llama2: OpenAI API request exceeded rate limit:{e} ") error = 3 pass #Si interesa saber el response entero print(f"response object: {response}") #Solo vamos a devolver la respuesta del sistema return (response.choices[0].message.content, error) ``` So according to the original OpenAI API: https://platform.openai.com/docs/guides/error-codes/python-library-error-types This is an example code: ```python import openai from openai import OpenAI client = OpenAI() try: #Make your OpenAI API request here response = client.completions.create( prompt="Hello world", model="gpt-3.5-turbo-instruct" ) except openai.APIError as e: #Handle API error here, e.g. retry or log print(f"OpenAI API returned an API Error: {e}") pass except openai.APIConnectionError as e: #Handle connection error here print(f"Failed to connect to OpenAI API: {e}") pass except openai.RateLimitError as e: #Handle rate limit error (we recommend using exponential backoff) print(f"OpenAI API request exceeded rate limit: {e}") pass ``` But im facing that running my function, jupyter netbook raises and error, so somehow the errors arent catched properly. For example, if i dont have ollama server running, should give an Connection refused or Connection Error. This error should be catched, but my jupyter netbook seems cant catch it. In concrete i have this error when ollama serve is stopped: ``` --------------------------------------------------------------------------- ConnectionRefusedError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:10, in map_exceptions(map) 9 try: ---> 10 yield 11 except Exception as exc: # noqa: PIE786 File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:206, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 205 with map_exceptions(exc_map): --> 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) 211 for option in socket_options: File /opt/anaconda3/lib/python3.11/socket.py:851, in create_connection(address, timeout, source_address, all_errors) 850 if not all_errors: --> 851 raise exceptions[0] 852 raise ExceptionGroup("create_connection failed", exceptions) File /opt/anaconda3/lib/python3.11/socket.py:836, in create_connection(address, timeout, source_address, all_errors) 835 sock.bind(source_address) --> 836 sock.connect(sa) 837 # Break explicitly a reference cycle ConnectionRefusedError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:67, in map_httpcore_exceptions() 66 try: ---> 67 yield 68 except Exception as exc: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:231, in HTTPTransport.handle_request(self, request) 230 with map_httpcore_exceptions(): --> 231 resp = self._pool.handle_request(req) 233 assert isinstance(resp.stream, typing.Iterable) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:268, in ConnectionPool.handle_request(self, request) 267 self.response_closed(status) --> 268 raise exc 269 else: File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:251, in ConnectionPool.handle_request(self, request) 250 try: --> 251 response = connection.handle_request(request) 252 except ConnectionNotAvailable: 253 # The ConnectionNotAvailable exception is a special case, that 254 # indicates we need to retry the request on a new connection. (...) 258 # might end up as an HTTP/2 connection, but which actually ends 259 # up as HTTP/1.1. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:99, in HTTPConnection.handle_request(self, request) 98 self._connect_failed = True ---> 99 raise exc 100 elif not self._connection.is_available(): File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:76, in HTTPConnection.handle_request(self, request) 75 try: ---> 76 stream = self._connect(request) 78 ssl_object = stream.get_extra_info("ssl_object") File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:124, in HTTPConnection._connect(self, request) 123 with Trace("connect_tcp", logger, request, kwargs) as trace: --> 124 stream = self._network_backend.connect_tcp(**kwargs) 125 trace.return_value = stream File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:205, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 200 exc_map: ExceptionMapping = { 201 socket.timeout: ConnectTimeout, 202 OSError: ConnectError, 203 } --> 205 with map_exceptions(exc_map): 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:14, in map_exceptions(map) 13 if isinstance(exc, from_exc): ---> 14 raise to_exc(exc) from exc 15 raise ConnectError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:897, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 896 try: --> 897 response = self._client.send( 898 request, 899 stream=stream or self._should_stream_response_body(request=request), 900 **kwargs, 901 ) 902 except httpx.TimeoutException as err: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:915, in Client.send(self, request, stream, auth, follow_redirects) 913 auth = self._build_request_auth(request, auth) --> 915 response = self._send_handling_auth( 916 request, 917 auth=auth, 918 follow_redirects=follow_redirects, 919 history=[], 920 ) 921 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:943, in Client._send_handling_auth(self, request, auth, follow_redirects, history) 942 while True: --> 943 response = self._send_handling_redirects( 944 request, 945 follow_redirects=follow_redirects, 946 history=history, 947 ) 948 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:980, in Client._send_handling_redirects(self, request, follow_redirects, history) 978 hook(request) --> 980 response = self._send_single_request(request) 981 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:1016, in Client._send_single_request(self, request) 1015 with request_context(request=request): -> 1016 response = transport.handle_request(request) 1018 assert isinstance(response.stream, SyncByteStream) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:230, in HTTPTransport.handle_request(self, request) 218 req = httpcore.Request( 219 method=request.method, 220 url=httpcore.URL( (...) 228 extensions=request.extensions, 229 ) --> 230 with map_httpcore_exceptions(): 231 resp = self._pool.handle_request(req) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:84, in map_httpcore_exceptions() 83 message = str(exc) ---> 84 raise mapped_exc(message) from exc ConnectError: [Errno 111] Connection refused During handling of the above exception, another exception occurred: ConnectionRefusedError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:10, in map_exceptions(map) 9 try: ---> 10 yield 11 except Exception as exc: # noqa: PIE786 File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:206, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 205 with map_exceptions(exc_map): --> 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) 211 for option in socket_options: File /opt/anaconda3/lib/python3.11/socket.py:851, in create_connection(address, timeout, source_address, all_errors) 850 if not all_errors: --> 851 raise exceptions[0] 852 raise ExceptionGroup("create_connection failed", exceptions) File /opt/anaconda3/lib/python3.11/socket.py:836, in create_connection(address, timeout, source_address, all_errors) 835 sock.bind(source_address) --> 836 sock.connect(sa) 837 # Break explicitly a reference cycle ConnectionRefusedError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:67, in map_httpcore_exceptions() 66 try: ---> 67 yield 68 except Exception as exc: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:231, in HTTPTransport.handle_request(self, request) 230 with map_httpcore_exceptions(): --> 231 resp = self._pool.handle_request(req) 233 assert isinstance(resp.stream, typing.Iterable) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:268, in ConnectionPool.handle_request(self, request) 267 self.response_closed(status) --> 268 raise exc 269 else: File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:251, in ConnectionPool.handle_request(self, request) 250 try: --> 251 response = connection.handle_request(request) 252 except ConnectionNotAvailable: 253 # The ConnectionNotAvailable exception is a special case, that 254 # indicates we need to retry the request on a new connection. (...) 258 # might end up as an HTTP/2 connection, but which actually ends 259 # up as HTTP/1.1. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:99, in HTTPConnection.handle_request(self, request) 98 self._connect_failed = True ---> 99 raise exc 100 elif not self._connection.is_available(): File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:76, in HTTPConnection.handle_request(self, request) 75 try: ---> 76 stream = self._connect(request) 78 ssl_object = stream.get_extra_info("ssl_object") File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:124, in HTTPConnection._connect(self, request) 123 with Trace("connect_tcp", logger, request, kwargs) as trace: --> 124 stream = self._network_backend.connect_tcp(**kwargs) 125 trace.return_value = stream File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:205, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 200 exc_map: ExceptionMapping = { 201 socket.timeout: ConnectTimeout, 202 OSError: ConnectError, 203 } --> 205 with map_exceptions(exc_map): 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:14, in map_exceptions(map) 13 if isinstance(exc, from_exc): ---> 14 raise to_exc(exc) from exc 15 raise ConnectError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:897, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 896 try: --> 897 response = self._client.send( 898 request, 899 stream=stream or self._should_stream_response_body(request=request), 900 **kwargs, 901 ) 902 except httpx.TimeoutException as err: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:915, in Client.send(self, request, stream, auth, follow_redirects) 913 auth = self._build_request_auth(request, auth) --> 915 response = self._send_handling_auth( 916 request, 917 auth=auth, 918 follow_redirects=follow_redirects, 919 history=[], 920 ) 921 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:943, in Client._send_handling_auth(self, request, auth, follow_redirects, history) 942 while True: --> 943 response = self._send_handling_redirects( 944 request, 945 follow_redirects=follow_redirects, 946 history=history, 947 ) 948 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:980, in Client._send_handling_redirects(self, request, follow_redirects, history) 978 hook(request) --> 980 response = self._send_single_request(request) 981 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:1016, in Client._send_single_request(self, request) 1015 with request_context(request=request): -> 1016 response = transport.handle_request(request) 1018 assert isinstance(response.stream, SyncByteStream) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:230, in HTTPTransport.handle_request(self, request) 218 req = httpcore.Request( 219 method=request.method, 220 url=httpcore.URL( (...) 228 extensions=request.extensions, 229 ) --> 230 with map_httpcore_exceptions(): 231 resp = self._pool.handle_request(req) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:84, in map_httpcore_exceptions() 83 message = str(exc) ---> 84 raise mapped_exc(message) from exc ConnectError: [Errno 111] Connection refused During handling of the above exception, another exception occurred: ConnectionRefusedError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:10, in map_exceptions(map) 9 try: ---> 10 yield 11 except Exception as exc: # noqa: PIE786 File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:206, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 205 with map_exceptions(exc_map): --> 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) 211 for option in socket_options: File /opt/anaconda3/lib/python3.11/socket.py:851, in create_connection(address, timeout, source_address, all_errors) 850 if not all_errors: --> 851 raise exceptions[0] 852 raise ExceptionGroup("create_connection failed", exceptions) File /opt/anaconda3/lib/python3.11/socket.py:836, in create_connection(address, timeout, source_address, all_errors) 835 sock.bind(source_address) --> 836 sock.connect(sa) 837 # Break explicitly a reference cycle ConnectionRefusedError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:67, in map_httpcore_exceptions() 66 try: ---> 67 yield 68 except Exception as exc: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:231, in HTTPTransport.handle_request(self, request) 230 with map_httpcore_exceptions(): --> 231 resp = self._pool.handle_request(req) 233 assert isinstance(resp.stream, typing.Iterable) File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:268, in ConnectionPool.handle_request(self, request) 267 self.response_closed(status) --> 268 raise exc 269 else: File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py:251, in ConnectionPool.handle_request(self, request) 250 try: --> 251 response = connection.handle_request(request) 252 except ConnectionNotAvailable: 253 # The ConnectionNotAvailable exception is a special case, that 254 # indicates we need to retry the request on a new connection. (...) 258 # might end up as an HTTP/2 connection, but which actually ends 259 # up as HTTP/1.1. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:99, in HTTPConnection.handle_request(self, request) 98 self._connect_failed = True ---> 99 raise exc 100 elif not self._connection.is_available(): File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:76, in HTTPConnection.handle_request(self, request) 75 try: ---> 76 stream = self._connect(request) 78 ssl_object = stream.get_extra_info("ssl_object") File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_sync/connection.py:124, in HTTPConnection._connect(self, request) 123 with Trace("connect_tcp", logger, request, kwargs) as trace: --> 124 stream = self._network_backend.connect_tcp(**kwargs) 125 trace.return_value = stream File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_backends/sync.py:205, in SyncBackend.connect_tcp(self, host, port, timeout, local_address, socket_options) 200 exc_map: ExceptionMapping = { 201 socket.timeout: ConnectTimeout, 202 OSError: ConnectError, 203 } --> 205 with map_exceptions(exc_map): 206 sock = socket.create_connection( 207 address, 208 timeout, 209 source_address=source_address, 210 ) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpcore/_exceptions.py:14, in map_exceptions(map) 13 if isinstance(exc, from_exc): ---> 14 raise to_exc(exc) from exc 15 raise ConnectError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: ConnectError Traceback (most recent call last) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:897, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 896 try: --> 897 response = self._client.send( 898 request, 899 stream=stream or self._should_stream_response_body(request=request), 900 **kwargs, 901 ) 902 except httpx.TimeoutException as err: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:915, in Client.send(self, request, stream, auth, follow_redirects) 913 auth = self._build_request_auth(request, auth) --> 915 response = self._send_handling_auth( 916 request, 917 auth=auth, 918 follow_redirects=follow_redirects, 919 history=[], 920 ) 921 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:943, in Client._send_handling_auth(self, request, auth, follow_redirects, history) 942 while True: --> 943 response = self._send_handling_redirects( 944 request, 945 follow_redirects=follow_redirects, 946 history=history, 947 ) 948 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:980, in Client._send_handling_redirects(self, request, follow_redirects, history) 978 hook(request) --> 980 response = self._send_single_request(request) 981 try: File /opt/anaconda3/lib/python3.11/site-packages/httpx/_client.py:1016, in Client._send_single_request(self, request) 1015 with request_context(request=request): -> 1016 response = transport.handle_request(request) 1018 assert isinstance(response.stream, SyncByteStream) File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:230, in HTTPTransport.handle_request(self, request) 218 req = httpcore.Request( 219 method=request.method, 220 url=httpcore.URL( (...) 228 extensions=request.extensions, 229 ) --> 230 with map_httpcore_exceptions(): 231 resp = self._pool.handle_request(req) File /opt/anaconda3/lib/python3.11/contextlib.py:158, in _GeneratorContextManager.__exit__(self, typ, value, traceback) 157 try: --> 158 self.gen.throw(typ, value, traceback) 159 except StopIteration as exc: 160 # Suppress StopIteration *unless* it's the same exception that 161 # was passed to throw(). This prevents a StopIteration 162 # raised inside the "with" statement from being suppressed. File /opt/anaconda3/lib/python3.11/site-packages/httpx/_transports/default.py:84, in map_httpcore_exceptions() 83 message = str(exc) ---> 84 raise mapped_exc(message) from exc ConnectError: [Errno 111] Connection refused The above exception was the direct cause of the following exception: APIConnectionError Traceback (most recent call last) Cell In[32], line 18, in llama_openaiv2(prompt, add_inst, model, temperature, max_tokens, verbose) 17 try: ---> 18 response = client.chat.completions.create( 19 messages=[ 20 { 21 'role': 'user', 22 'content': prompt, 23 } 24 ], 25 model=model, 26 max_tokens=max_tokens, 27 temperature=temperature 28 ) 30 except openai.APIError as e: 31 #Handle API error here, e.g. retry or log File /opt/anaconda3/lib/python3.11/site-packages/openai/_utils/_utils.py:271, in required_args.<locals>.inner.<locals>.wrapper(*args, **kwargs) 270 raise TypeError(msg) --> 271 return func(*args, **kwargs) File /opt/anaconda3/lib/python3.11/site-packages/openai/resources/chat/completions.py:648, in Completions.create(self, messages, model, frequency_penalty, function_call, functions, logit_bias, logprobs, max_tokens, n, presence_penalty, response_format, seed, stop, stream, temperature, tool_choice, tools, top_logprobs, top_p, user, extra_headers, extra_query, extra_body, timeout) 599 @required_args(["messages", "model"], ["messages", "model", "stream"]) 600 def create( 601 self, (...) 646 timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN, 647 ) -> ChatCompletion | Stream[ChatCompletionChunk]: --> 648 return self._post( 649 "/chat/completions", 650 body=maybe_transform( 651 { 652 "messages": messages, 653 "model": model, 654 "frequency_penalty": frequency_penalty, 655 "function_call": function_call, 656 "functions": functions, 657 "logit_bias": logit_bias, 658 "logprobs": logprobs, 659 "max_tokens": max_tokens, 660 "n": n, 661 "presence_penalty": presence_penalty, 662 "response_format": response_format, 663 "seed": seed, 664 "stop": stop, 665 "stream": stream, 666 "temperature": temperature, 667 "tool_choice": tool_choice, 668 "tools": tools, 669 "top_logprobs": top_logprobs, 670 "top_p": top_p, 671 "user": user, 672 }, 673 completion_create_params.CompletionCreateParams, 674 ), 675 options=make_request_options( 676 extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout 677 ), 678 cast_to=ChatCompletion, 679 stream=stream or False, 680 stream_cls=Stream[ChatCompletionChunk], 681 ) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:1179, in SyncAPIClient.post(self, path, cast_to, body, options, files, stream, stream_cls) 1176 opts = FinalRequestOptions.construct( 1177 method="post", url=path, json_data=body, files=to_httpx_files(files), **options 1178 ) -> 1179 return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:868, in SyncAPIClient.request(self, cast_to, options, remaining_retries, stream, stream_cls) 859 def request( 860 self, 861 cast_to: Type[ResponseT], (...) 866 stream_cls: type[_StreamT] | None = None, 867 ) -> ResponseT | _StreamT: --> 868 return self._request( 869 cast_to=cast_to, 870 options=options, 871 stream=stream, 872 stream_cls=stream_cls, 873 remaining_retries=remaining_retries, 874 ) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:921, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 920 if retries > 0: --> 921 return self._retry_request( 922 options, 923 cast_to, 924 retries, 925 stream=stream, 926 stream_cls=stream_cls, 927 response_headers=None, 928 ) 930 log.debug("Raising connection error") File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:992, in SyncAPIClient._retry_request(self, options, cast_to, remaining_retries, response_headers, stream, stream_cls) 990 time.sleep(timeout) --> 992 return self._request( 993 options=options, 994 cast_to=cast_to, 995 remaining_retries=remaining, 996 stream=stream, 997 stream_cls=stream_cls, 998 ) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:921, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 920 if retries > 0: --> 921 return self._retry_request( 922 options, 923 cast_to, 924 retries, 925 stream=stream, 926 stream_cls=stream_cls, 927 response_headers=None, 928 ) 930 log.debug("Raising connection error") File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:992, in SyncAPIClient._retry_request(self, options, cast_to, remaining_retries, response_headers, stream, stream_cls) 990 time.sleep(timeout) --> 992 return self._request( 993 options=options, 994 cast_to=cast_to, 995 remaining_retries=remaining, 996 stream=stream, 997 stream_cls=stream_cls, 998 ) File /opt/anaconda3/lib/python3.11/site-packages/openai/_base_client.py:931, in SyncAPIClient._request(self, cast_to, options, remaining_retries, stream, stream_cls) 930 log.debug("Raising connection error") --> 931 raise APIConnectionError(request=request) from err 933 log.debug( 934 'HTTP Request: %s %s "%i %s"', request.method, request.url, response.status_code, response.reason_phrase 935 ) APIConnectionError: Connection error. During handling of the above exception, another exception occurred: NameError Traceback (most recent call last) Cell In[40], line 18, in llama_chat_openaiv2(promptActual, prompts, responses, model, temperature, max_tokens, verbose) 15 try: ---> 18 response, error = llama_openaiv2(prompt=prompt, 19 add_inst=False, 20 model=model, 21 temperature=temperature, 22 max_tokens=max_tokens, 23 verbose=verbose, 24 ) 25 except ollama.ResponseError as e: Cell In[32], line 30, in llama_openaiv2(prompt, add_inst, model, temperature, max_tokens, verbose) 18 response = client.chat.completions.create( 19 messages=[ 20 { (...) 27 temperature=temperature 28 ) ---> 30 except openai.APIError as e: 31 #Handle API error here, e.g. retry or log 32 print(f"Llama2: OpenAI API returned an API Error:{e} ") NameError: name 'openai' is not defined During handling of the above exception, another exception occurred: NameError Traceback (most recent call last) Cell In[43], line 16 11 responses = [] 13 prompt_1 = """ 14 What are fun activities I can do this weekend? 15 """ ---> 16 response_1 = llama_chat_openaiv2(prompt_1, prompts, responses) Cell In[40], line 25, in llama_chat_openaiv2(promptActual, prompts, responses, model, temperature, max_tokens, verbose) 15 try: 18 response, error = llama_openaiv2(prompt=prompt, 19 add_inst=False, 20 model=model, (...) 23 verbose=verbose, 24 ) ---> 25 except ollama.ResponseError as e: 26 print('Error:', e.content) 27 #if e.status_code == 404: 28 # ollama.pull(model) NameError: name 'ollama' is not defined ``` I also tried: ``` try: response, error = llama_openaiv2(prompt=prompt, add_inst=False, model=model, temperature=temperature, max_tokens=max_tokens, verbose=verbose, ) except ollama.ResponseError as e: print('Error:', e.content) #if e.status_code == 404: # ollama.pull(model) ``` according to ollama documentation...but no luck as you can see above to catch this errors. 1 - So is chat completions using the same try/catch errors ? Or isnt still implemented ? Or perhaps there are other errors/exceptions in ollama ? 2 - Related to the first one_ where can i find information about handling errors in ollama ? Im using python. openai 1.14.1 ollama 1.0.29 Thanks
{ "login": "ejgutierrez74", "id": 11474846, "node_id": "MDQ6VXNlcjExNDc0ODQ2", "avatar_url": "https://avatars.githubusercontent.com/u/11474846?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ejgutierrez74", "html_url": "https://github.com/ejgutierrez74", "followers_url": "https://api.github.com/users/ejgutierrez74/followers", "following_url": "https://api.github.com/users/ejgutierrez74/following{/other_user}", "gists_url": "https://api.github.com/users/ejgutierrez74/gists{/gist_id}", "starred_url": "https://api.github.com/users/ejgutierrez74/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ejgutierrez74/subscriptions", "organizations_url": "https://api.github.com/users/ejgutierrez74/orgs", "repos_url": "https://api.github.com/users/ejgutierrez74/repos", "events_url": "https://api.github.com/users/ejgutierrez74/events{/privacy}", "received_events_url": "https://api.github.com/users/ejgutierrez74/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3221/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3221/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5260
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5260/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5260/comments
https://api.github.com/repos/ollama/ollama/issues/5260/events
https://github.com/ollama/ollama/issues/5260
2,371,200,516
I_kwDOJ0Z1Ps6NVaYE
5,260
Code autopilot
{ "login": "perpendicularai", "id": 146530480, "node_id": "U_kgDOCLvgsA", "avatar_url": "https://avatars.githubusercontent.com/u/146530480?v=4", "gravatar_id": "", "url": "https://api.github.com/users/perpendicularai", "html_url": "https://github.com/perpendicularai", "followers_url": "https://api.github.com/users/perpendicularai/followers", "following_url": "https://api.github.com/users/perpendicularai/following{/other_user}", "gists_url": "https://api.github.com/users/perpendicularai/gists{/gist_id}", "starred_url": "https://api.github.com/users/perpendicularai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/perpendicularai/subscriptions", "organizations_url": "https://api.github.com/users/perpendicularai/orgs", "repos_url": "https://api.github.com/users/perpendicularai/repos", "events_url": "https://api.github.com/users/perpendicularai/events{/privacy}", "received_events_url": "https://api.github.com/users/perpendicularai/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
0
2024-06-24T22:14:12
2024-07-02T01:25:17
2024-07-02T01:25:17
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://github.com/marketplace/code-autopilot-ai-coder
{ "login": "perpendicularai", "id": 146530480, "node_id": "U_kgDOCLvgsA", "avatar_url": "https://avatars.githubusercontent.com/u/146530480?v=4", "gravatar_id": "", "url": "https://api.github.com/users/perpendicularai", "html_url": "https://github.com/perpendicularai", "followers_url": "https://api.github.com/users/perpendicularai/followers", "following_url": "https://api.github.com/users/perpendicularai/following{/other_user}", "gists_url": "https://api.github.com/users/perpendicularai/gists{/gist_id}", "starred_url": "https://api.github.com/users/perpendicularai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/perpendicularai/subscriptions", "organizations_url": "https://api.github.com/users/perpendicularai/orgs", "repos_url": "https://api.github.com/users/perpendicularai/repos", "events_url": "https://api.github.com/users/perpendicularai/events{/privacy}", "received_events_url": "https://api.github.com/users/perpendicularai/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5260/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5260/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6064
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6064/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6064/comments
https://api.github.com/repos/ollama/ollama/issues/6064/events
https://github.com/ollama/ollama/pull/6064
2,436,506,166
PR_kwDOJ0Z1Ps52zi4K
6,064
convert: update llama conversion for llama3.1
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-29T22:33:17
2024-08-21T19:57:11
2024-08-21T19:57:09
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6064", "html_url": "https://github.com/ollama/ollama/pull/6064", "diff_url": "https://github.com/ollama/ollama/pull/6064.diff", "patch_url": "https://github.com/ollama/ollama/pull/6064.patch", "merged_at": "2024-08-21T19:57:09" }
llama3.1 contains a new tensor for rope scaling factors. derive this new tensor from llama3.1 configs if they exist
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6064/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/6064/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1472
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1472/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1472/comments
https://api.github.com/repos/ollama/ollama/issues/1472/events
https://github.com/ollama/ollama/issues/1472
2,036,434,627
I_kwDOJ0Z1Ps55YYbD
1,472
Support for fully airgapped environment
{ "login": "yyefet", "id": 11426837, "node_id": "MDQ6VXNlcjExNDI2ODM3", "avatar_url": "https://avatars.githubusercontent.com/u/11426837?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yyefet", "html_url": "https://github.com/yyefet", "followers_url": "https://api.github.com/users/yyefet/followers", "following_url": "https://api.github.com/users/yyefet/following{/other_user}", "gists_url": "https://api.github.com/users/yyefet/gists{/gist_id}", "starred_url": "https://api.github.com/users/yyefet/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yyefet/subscriptions", "organizations_url": "https://api.github.com/users/yyefet/orgs", "repos_url": "https://api.github.com/users/yyefet/repos", "events_url": "https://api.github.com/users/yyefet/events{/privacy}", "received_events_url": "https://api.github.com/users/yyefet/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2023-12-11T20:04:01
2023-12-11T21:40:09
2023-12-11T21:37:33
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Feature Request to support fully air-gapped environments to ensure no calls/requests leave the server externally if user desires. Proposing --airgap or --no-external flag to disable all telemetry, pull from public repos, and phoning home of any sorts.
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1472/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1472/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5973
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5973/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5973/comments
https://api.github.com/repos/ollama/ollama/issues/5973/events
https://github.com/ollama/ollama/issues/5973
2,431,507,573
I_kwDOJ0Z1Ps6Q7dx1
5,973
Error: template: :28:7: executing "" at <.ToolCalls>: can't evaluate field ToolCalls in type *api.Message
{ "login": "dashan996", "id": 164734277, "node_id": "U_kgDOCdGlRQ", "avatar_url": "https://avatars.githubusercontent.com/u/164734277?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dashan996", "html_url": "https://github.com/dashan996", "followers_url": "https://api.github.com/users/dashan996/followers", "following_url": "https://api.github.com/users/dashan996/following{/other_user}", "gists_url": "https://api.github.com/users/dashan996/gists{/gist_id}", "starred_url": "https://api.github.com/users/dashan996/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dashan996/subscriptions", "organizations_url": "https://api.github.com/users/dashan996/orgs", "repos_url": "https://api.github.com/users/dashan996/repos", "events_url": "https://api.github.com/users/dashan996/events{/privacy}", "received_events_url": "https://api.github.com/users/dashan996/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
12
2024-07-26T06:18:18
2024-07-26T21:24:37
2024-07-26T21:24:37
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? After downloaded llama3.1:70b and run it, I meet this question. I have tried other models and everyone works well. I deleted it and download again, but error still appears again. ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version ollama version is 0.2.2
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5973/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5973/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8223
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8223/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8223/comments
https://api.github.com/repos/ollama/ollama/issues/8223/events
https://github.com/ollama/ollama/issues/8223
2,756,961,848
I_kwDOJ0Z1Ps6kU-Y4
8,223
Swagger UI implementation for basic testing on the ollama API
{ "login": "jordi-vancuijlenborg-vinci", "id": 45209125, "node_id": "MDQ6VXNlcjQ1MjA5MTI1", "avatar_url": "https://avatars.githubusercontent.com/u/45209125?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jordi-vancuijlenborg-vinci", "html_url": "https://github.com/jordi-vancuijlenborg-vinci", "followers_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/followers", "following_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/following{/other_user}", "gists_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/gists{/gist_id}", "starred_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/subscriptions", "organizations_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/orgs", "repos_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/repos", "events_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/events{/privacy}", "received_events_url": "https://api.github.com/users/jordi-vancuijlenborg-vinci/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
2
2024-12-24T01:02:52
2024-12-24T19:22:27
2024-12-24T19:22:27
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8223/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8223/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3664
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3664/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3664/comments
https://api.github.com/repos/ollama/ollama/issues/3664/events
https://github.com/ollama/ollama/pull/3664
2,244,867,811
PR_kwDOJ0Z1Ps5svaO6
3,664
fix padding to only return padding
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-04-16T00:32:11
2024-04-17T22:57:41
2024-04-17T22:57:40
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3664", "html_url": "https://github.com/ollama/ollama/pull/3664", "diff_url": "https://github.com/ollama/ollama/pull/3664.diff", "patch_url": "https://github.com/ollama/ollama/pull/3664.patch", "merged_at": "2024-04-17T22:57:40" }
follow up to #3663 to simplify padding()
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3664/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3664/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3108
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3108/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3108/comments
https://api.github.com/repos/ollama/ollama/issues/3108/events
https://github.com/ollama/ollama/issues/3108
2,184,218,049
I_kwDOJ0Z1Ps6CMIXB
3,108
Usability improvement for ollama rm
{ "login": "aosan", "id": 8534160, "node_id": "MDQ6VXNlcjg1MzQxNjA=", "avatar_url": "https://avatars.githubusercontent.com/u/8534160?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aosan", "html_url": "https://github.com/aosan", "followers_url": "https://api.github.com/users/aosan/followers", "following_url": "https://api.github.com/users/aosan/following{/other_user}", "gists_url": "https://api.github.com/users/aosan/gists{/gist_id}", "starred_url": "https://api.github.com/users/aosan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/aosan/subscriptions", "organizations_url": "https://api.github.com/users/aosan/orgs", "repos_url": "https://api.github.com/users/aosan/repos", "events_url": "https://api.github.com/users/aosan/events{/privacy}", "received_events_url": "https://api.github.com/users/aosan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
1
2024-03-13T14:54:10
2024-03-23T19:51:28
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
In the current implementation, `ollama rm` removes a model without a prompt for confirmation. Please consider the inclusion of a confirmation prompt, with the customary N/y option to avoid removing models by mistake, with `run` and `rm` being similar.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3108/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3108/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/8019
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8019/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8019/comments
https://api.github.com/repos/ollama/ollama/issues/8019/events
https://github.com/ollama/ollama/pull/8019
2,728,479,652
PR_kwDOJ0Z1Ps6EnePW
8,019
Delete redundancy code when never happends.
{ "login": "zhanluxianshen", "id": 161462588, "node_id": "U_kgDOCZ-5PA", "avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhanluxianshen", "html_url": "https://github.com/zhanluxianshen", "followers_url": "https://api.github.com/users/zhanluxianshen/followers", "following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}", "gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions", "organizations_url": "https://api.github.com/users/zhanluxianshen/orgs", "repos_url": "https://api.github.com/users/zhanluxianshen/repos", "events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}", "received_events_url": "https://api.github.com/users/zhanluxianshen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
0
2024-12-09T23:06:47
2024-12-11T00:07:18
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8019", "html_url": "https://github.com/ollama/ollama/pull/8019", "diff_url": "https://github.com/ollama/ollama/pull/8019.diff", "patch_url": "https://github.com/ollama/ollama/pull/8019.patch", "merged_at": null }
This code will never happend.. Its dead code. and just waste CPU instructions..
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8019/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8019/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/248
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/248/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/248/comments
https://api.github.com/repos/ollama/ollama/issues/248/events
https://github.com/ollama/ollama/issues/248
1,830,013,491
I_kwDOJ0Z1Ps5tE8oz
248
Modelfile only packages in one license
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2023-07-31T20:55:54
2023-08-02T00:18:16
2023-08-02T00:18:15
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When two licenses are specified, one gets removed from the packaged modelfile ``` case "license", "template", "system", "prompt": fn(api.ProgressResponse{Status: fmt.Sprintf("creating model %s layer", c.Name)}) // remove the prompt layer if one exists mediaType := fmt.Sprintf("application/vnd.ollama.image.%s", c.Name) layers = removeLayerFromLayers(layers, mediaType) layer, err := CreateLayer(strings.NewReader(c.Args)) if err != nil { return err } layer.MediaType = mediaType layers = append(layers, layer) ```
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/248/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/248/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1519
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1519/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1519/comments
https://api.github.com/repos/ollama/ollama/issues/1519/events
https://github.com/ollama/ollama/issues/1519
2,041,485,230
I_kwDOJ0Z1Ps55rpeu
1,519
LLM Model Cache files
{ "login": "PrasannaVnewtglobal", "id": 145771576, "node_id": "U_kgDOCLBMOA", "avatar_url": "https://avatars.githubusercontent.com/u/145771576?v=4", "gravatar_id": "", "url": "https://api.github.com/users/PrasannaVnewtglobal", "html_url": "https://github.com/PrasannaVnewtglobal", "followers_url": "https://api.github.com/users/PrasannaVnewtglobal/followers", "following_url": "https://api.github.com/users/PrasannaVnewtglobal/following{/other_user}", "gists_url": "https://api.github.com/users/PrasannaVnewtglobal/gists{/gist_id}", "starred_url": "https://api.github.com/users/PrasannaVnewtglobal/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/PrasannaVnewtglobal/subscriptions", "organizations_url": "https://api.github.com/users/PrasannaVnewtglobal/orgs", "repos_url": "https://api.github.com/users/PrasannaVnewtglobal/repos", "events_url": "https://api.github.com/users/PrasannaVnewtglobal/events{/privacy}", "received_events_url": "https://api.github.com/users/PrasannaVnewtglobal/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
5
2023-12-14T11:17:35
2024-05-10T04:36:24
2024-05-10T00:55:57
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Ollama store the LLM model in the modelfile "List", When I try to run the model in the first SSH session it giving the good results and store some caches, but when i try to open new session it not utilizing the previous response cache, where the cache file is present for the LLM model, i couldn't find the cache file. what is the possible way to achieve the consistency results, same time i can't find the configuration file for the LLM model. give the update for this issue.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1519/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1519/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/359
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/359/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/359/comments
https://api.github.com/repos/ollama/ollama/issues/359/events
https://github.com/ollama/ollama/issues/359
1,853,012,052
I_kwDOJ0Z1Ps5ucrhU
359
Where are the Modelfiles?
{ "login": "khromov", "id": 1207507, "node_id": "MDQ6VXNlcjEyMDc1MDc=", "avatar_url": "https://avatars.githubusercontent.com/u/1207507?v=4", "gravatar_id": "", "url": "https://api.github.com/users/khromov", "html_url": "https://github.com/khromov", "followers_url": "https://api.github.com/users/khromov/followers", "following_url": "https://api.github.com/users/khromov/following{/other_user}", "gists_url": "https://api.github.com/users/khromov/gists{/gist_id}", "starred_url": "https://api.github.com/users/khromov/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/khromov/subscriptions", "organizations_url": "https://api.github.com/users/khromov/orgs", "repos_url": "https://api.github.com/users/khromov/repos", "events_url": "https://api.github.com/users/khromov/events{/privacy}", "received_events_url": "https://api.github.com/users/khromov/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396191, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw", "url": "https://api.github.com/repos/ollama/ollama/labels/documentation", "name": "documentation", "color": "0075ca", "default": true, "description": "Improvements or additions to documentation" } ]
closed
false
null
[]
null
3
2023-08-16T11:07:07
2023-08-22T01:02:58
2023-08-22T01:02:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
At some point the model files seem to have been located in the repo, such as this result which shows up on Google but now they are gone. Where can we find them? https://github.com/jmorganca/ollama/blob/main/library/modelfiles/llama2
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/359/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/359/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2241
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2241/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2241/comments
https://api.github.com/repos/ollama/ollama/issues/2241/events
https://github.com/ollama/ollama/pull/2241
2,104,343,768
PR_kwDOJ0Z1Ps5lQ8gH
2,241
Do not repeat system prompt for chat templating
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-01-28T20:53:47
2024-01-28T22:15:57
2024-01-28T22:15:57
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2241", "html_url": "https://github.com/ollama/ollama/pull/2241", "diff_url": "https://github.com/ollama/ollama/pull/2241.diff", "patch_url": "https://github.com/ollama/ollama/pull/2241.patch", "merged_at": "2024-01-28T22:15:57" }
Before: ``` <|im_start|>system You are a happy dog<|im_end|> <|im_start|>assistant hi im a friendly assistant<|im_end|> <|im_start|>system You are a happy dog<|im_end|> <|im_start|>user who are you?<|im_end|> ``` After: ``` <|im_start|>system You are a happy dog<|im_end|> <|im_start|>assistant hi im a friendly assistant<|im_end|> <|im_start|>user who are you?<|im_end|> ```
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2241/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2241/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7949
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7949/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7949/comments
https://api.github.com/repos/ollama/ollama/issues/7949/events
https://github.com/ollama/ollama/issues/7949
2,719,919,741
I_kwDOJ0Z1Ps6iHq59
7,949
panic: failed to decode batch: could not find a kv cache slot goroutine 22 [running]: main.(*Server).run(0xc0000c2120, {0x556536b63ba0, 0xc00008a0a0}) github.com/ollama/ollama/llama/runner/runner.go:344 +0x23e created by main.main in goroutine 1 github.com/ollama/ollama/llama/runner/runner.go:978 +0xcc7
{ "login": "watch-Ultra", "id": 177522180, "node_id": "U_kgDOCpTGBA", "avatar_url": "https://avatars.githubusercontent.com/u/177522180?v=4", "gravatar_id": "", "url": "https://api.github.com/users/watch-Ultra", "html_url": "https://github.com/watch-Ultra", "followers_url": "https://api.github.com/users/watch-Ultra/followers", "following_url": "https://api.github.com/users/watch-Ultra/following{/other_user}", "gists_url": "https://api.github.com/users/watch-Ultra/gists{/gist_id}", "starred_url": "https://api.github.com/users/watch-Ultra/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/watch-Ultra/subscriptions", "organizations_url": "https://api.github.com/users/watch-Ultra/orgs", "repos_url": "https://api.github.com/users/watch-Ultra/repos", "events_url": "https://api.github.com/users/watch-Ultra/events{/privacy}", "received_events_url": "https://api.github.com/users/watch-Ultra/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
9
2024-12-05T09:55:40
2025-01-04T05:00:33
2024-12-17T22:01:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7949/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7949/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7601
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7601/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7601/comments
https://api.github.com/repos/ollama/ollama/issues/7601/events
https://github.com/ollama/ollama/issues/7601
2,647,474,412
I_kwDOJ0Z1Ps6dzUDs
7,601
Updating OpenCoder model information in library
{ "login": "elsatch", "id": 653433, "node_id": "MDQ6VXNlcjY1MzQzMw==", "avatar_url": "https://avatars.githubusercontent.com/u/653433?v=4", "gravatar_id": "", "url": "https://api.github.com/users/elsatch", "html_url": "https://github.com/elsatch", "followers_url": "https://api.github.com/users/elsatch/followers", "following_url": "https://api.github.com/users/elsatch/following{/other_user}", "gists_url": "https://api.github.com/users/elsatch/gists{/gist_id}", "starred_url": "https://api.github.com/users/elsatch/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/elsatch/subscriptions", "organizations_url": "https://api.github.com/users/elsatch/orgs", "repos_url": "https://api.github.com/users/elsatch/repos", "events_url": "https://api.github.com/users/elsatch/events{/privacy}", "received_events_url": "https://api.github.com/users/elsatch/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-11-10T17:46:11
2024-11-10T21:21:06
2024-11-10T21:21:05
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I have been trying to use the new OpenCoder model with Continue extension and discovered it doesn't work. Asking at Continue's discord someone pointed me to this thread at HF: https://huggingface.co/infly/OpenCoder-8B-Instruct/discussions/2#67304d50de572a8535e3d20b In that thread you can find the following information: "Hi! Unfortunately OpenCoder doesn't support the FIM task. Our model added this token due to certain historical considerations." Ollama includes the following description for OpenCoder models: OpenCoder is an open and reproducible code LLM family which includes 1.5B and 8B models, supporting both chat and fill-in-the-middle for English and Chinese languages. There is also one example at the bottom of the page for Fill in the Middle task, which the model doesn't seem to support. HF model card doesn't include any information about fill in the middle: OpenCoder is an open and reproducible code LLM family which includes 1.5B and 8B base and chat models, supporting both English and Chinese languages. Note that in the original model card the model includes base and chat model, in Llama library it's chat and fill in the middle. Maybe the model card was updated afterwards. Could you update the ollama library information to avoid any confusion for final users? ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version _No response_
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7601/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7601/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1238
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1238/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1238/comments
https://api.github.com/repos/ollama/ollama/issues/1238/events
https://github.com/ollama/ollama/issues/1238
2,006,254,412
I_kwDOJ0Z1Ps53lQNM
1,238
Feature request: Chat logs auto-save by default
{ "login": "bitcoinmeetups", "id": 2834754, "node_id": "MDQ6VXNlcjI4MzQ3NTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/2834754?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bitcoinmeetups", "html_url": "https://github.com/bitcoinmeetups", "followers_url": "https://api.github.com/users/bitcoinmeetups/followers", "following_url": "https://api.github.com/users/bitcoinmeetups/following{/other_user}", "gists_url": "https://api.github.com/users/bitcoinmeetups/gists{/gist_id}", "starred_url": "https://api.github.com/users/bitcoinmeetups/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bitcoinmeetups/subscriptions", "organizations_url": "https://api.github.com/users/bitcoinmeetups/orgs", "repos_url": "https://api.github.com/users/bitcoinmeetups/repos", "events_url": "https://api.github.com/users/bitcoinmeetups/events{/privacy}", "received_events_url": "https://api.github.com/users/bitcoinmeetups/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
2
2023-11-22T12:38:26
2024-03-12T20:16:24
2024-03-12T20:16:24
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, I'm Mr. Cascade, a very friendly guy. I would like to make the following feature request: Chats automatically being saved by default. This is also the expected behaviour. Similar software like terminal gpt has it enabled by default. I'm on a server where I can't scroll up on Tmux so while testing I lost interesting content several times just because I couldn't scroll up to retrieve it. Mockup: All chat logs goes automatically to /etc/ollama_chats or something like that Otherwise - love your product. I test a lot of packages and this one is outstanding.
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1238/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1238/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1991
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1991/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1991/comments
https://api.github.com/repos/ollama/ollama/issues/1991/events
https://github.com/ollama/ollama/issues/1991
2,080,854,566
I_kwDOJ0Z1Ps58B1Im
1,991
Error: Post "http://127.0.0.1:11434/api/generate": EOF
{ "login": "joesalvati68", "id": 59943835, "node_id": "MDQ6VXNlcjU5OTQzODM1", "avatar_url": "https://avatars.githubusercontent.com/u/59943835?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joesalvati68", "html_url": "https://github.com/joesalvati68", "followers_url": "https://api.github.com/users/joesalvati68/followers", "following_url": "https://api.github.com/users/joesalvati68/following{/other_user}", "gists_url": "https://api.github.com/users/joesalvati68/gists{/gist_id}", "starred_url": "https://api.github.com/users/joesalvati68/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/joesalvati68/subscriptions", "organizations_url": "https://api.github.com/users/joesalvati68/orgs", "repos_url": "https://api.github.com/users/joesalvati68/repos", "events_url": "https://api.github.com/users/joesalvati68/events{/privacy}", "received_events_url": "https://api.github.com/users/joesalvati68/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
21
2024-01-14T18:54:14
2024-02-11T10:12:26
2024-01-28T20:00:21
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
(base) user@userAlienware:~$ ollama run vicuna Error: Post "http://127.0.0.1:11434/api/generate": EOF (base) user@userAlienware:~$ I keep getting this after initial install and I can't figure out why. Any ideas?
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1991/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1991/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3270
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3270/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3270/comments
https://api.github.com/repos/ollama/ollama/issues/3270/events
https://github.com/ollama/ollama/pull/3270
2,197,461,029
PR_kwDOJ0Z1Ps5qN8aS
3,270
refactor(cmd): distribute commands into root.go, create.go, run.go, etc.
{ "login": "igophper", "id": 34326532, "node_id": "MDQ6VXNlcjM0MzI2NTMy", "avatar_url": "https://avatars.githubusercontent.com/u/34326532?v=4", "gravatar_id": "", "url": "https://api.github.com/users/igophper", "html_url": "https://github.com/igophper", "followers_url": "https://api.github.com/users/igophper/followers", "following_url": "https://api.github.com/users/igophper/following{/other_user}", "gists_url": "https://api.github.com/users/igophper/gists{/gist_id}", "starred_url": "https://api.github.com/users/igophper/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/igophper/subscriptions", "organizations_url": "https://api.github.com/users/igophper/orgs", "repos_url": "https://api.github.com/users/igophper/repos", "events_url": "https://api.github.com/users/igophper/events{/privacy}", "received_events_url": "https://api.github.com/users/igophper/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-03-20T12:51:09
2024-04-07T09:57:30
2024-04-07T06:37:04
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3270", "html_url": "https://github.com/ollama/ollama/pull/3270", "diff_url": "https://github.com/ollama/ollama/pull/3270.diff", "patch_url": "https://github.com/ollama/ollama/pull/3270.patch", "merged_at": null }
Previously, all commands were located in a single cmd.go file. This refactor improves the organization and readability of the code by distributing commands into their respective files such as root.go, create.go, run.go, etc.
{ "login": "igophper", "id": 34326532, "node_id": "MDQ6VXNlcjM0MzI2NTMy", "avatar_url": "https://avatars.githubusercontent.com/u/34326532?v=4", "gravatar_id": "", "url": "https://api.github.com/users/igophper", "html_url": "https://github.com/igophper", "followers_url": "https://api.github.com/users/igophper/followers", "following_url": "https://api.github.com/users/igophper/following{/other_user}", "gists_url": "https://api.github.com/users/igophper/gists{/gist_id}", "starred_url": "https://api.github.com/users/igophper/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/igophper/subscriptions", "organizations_url": "https://api.github.com/users/igophper/orgs", "repos_url": "https://api.github.com/users/igophper/repos", "events_url": "https://api.github.com/users/igophper/events{/privacy}", "received_events_url": "https://api.github.com/users/igophper/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3270/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3270/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2646
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2646/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2646/comments
https://api.github.com/repos/ollama/ollama/issues/2646/events
https://github.com/ollama/ollama/issues/2646
2,147,447,221
I_kwDOJ0Z1Ps5__3G1
2,646
Defect: EOF on running with Gemma:7b
{ "login": "kvchitrapu", "id": 44282098, "node_id": "MDQ6VXNlcjQ0MjgyMDk4", "avatar_url": "https://avatars.githubusercontent.com/u/44282098?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kvchitrapu", "html_url": "https://github.com/kvchitrapu", "followers_url": "https://api.github.com/users/kvchitrapu/followers", "following_url": "https://api.github.com/users/kvchitrapu/following{/other_user}", "gists_url": "https://api.github.com/users/kvchitrapu/gists{/gist_id}", "starred_url": "https://api.github.com/users/kvchitrapu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kvchitrapu/subscriptions", "organizations_url": "https://api.github.com/users/kvchitrapu/orgs", "repos_url": "https://api.github.com/users/kvchitrapu/repos", "events_url": "https://api.github.com/users/kvchitrapu/events{/privacy}", "received_events_url": "https://api.github.com/users/kvchitrapu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-02-21T18:39:05
2024-02-21T18:49:38
2024-02-21T18:49:38
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
OS: Mac M1 Pro ``` $ ollama run gemma:7b pulling manifest pulling 2c5f288be750... 100% ▕████████████████████████████████████████████▏ 4.8 GB pulling 097a36493f71... 100% ▕████████████████████████████████████████████▏ 8.4 KB pulling 109037bec39c... 100% ▕████████████████████████████████████████████▏ 136 B pulling 2490e7468436... 100% ▕████████████████████████████████████████████▏ 65 B pulling b5da6a03f7b9... 100% ▕████████████████████████████████████████████▏ 483 B verifying sha256 digest writing manifest removing any unused layers success Error: Post "[http://127.0.0.1:11434/api/chat":](http://127.0.0.1:11434/api/chat%22:) EOF ```
{ "login": "kvchitrapu", "id": 44282098, "node_id": "MDQ6VXNlcjQ0MjgyMDk4", "avatar_url": "https://avatars.githubusercontent.com/u/44282098?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kvchitrapu", "html_url": "https://github.com/kvchitrapu", "followers_url": "https://api.github.com/users/kvchitrapu/followers", "following_url": "https://api.github.com/users/kvchitrapu/following{/other_user}", "gists_url": "https://api.github.com/users/kvchitrapu/gists{/gist_id}", "starred_url": "https://api.github.com/users/kvchitrapu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kvchitrapu/subscriptions", "organizations_url": "https://api.github.com/users/kvchitrapu/orgs", "repos_url": "https://api.github.com/users/kvchitrapu/repos", "events_url": "https://api.github.com/users/kvchitrapu/events{/privacy}", "received_events_url": "https://api.github.com/users/kvchitrapu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2646/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2646/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2498
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2498/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2498/comments
https://api.github.com/repos/ollama/ollama/issues/2498/events
https://github.com/ollama/ollama/pull/2498
2,134,944,637
PR_kwDOJ0Z1Ps5m5FV_
2,498
Windows Preview
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-02-14T18:35:49
2024-02-14T18:40:05
2024-02-14T18:40:04
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2498", "html_url": "https://github.com/ollama/ollama/pull/2498", "diff_url": "https://github.com/ollama/ollama/pull/2498.diff", "patch_url": "https://github.com/ollama/ollama/pull/2498.patch", "merged_at": null }
Copy of #2481 using branch
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2498/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2498/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/820
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/820/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/820/comments
https://api.github.com/repos/ollama/ollama/issues/820/events
https://github.com/ollama/ollama/issues/820
1,947,728,944
I_kwDOJ0Z1Ps50F_ww
820
interactive mode with prompt as argument
{ "login": "jonas-w", "id": 32615971, "node_id": "MDQ6VXNlcjMyNjE1OTcx", "avatar_url": "https://avatars.githubusercontent.com/u/32615971?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jonas-w", "html_url": "https://github.com/jonas-w", "followers_url": "https://api.github.com/users/jonas-w/followers", "following_url": "https://api.github.com/users/jonas-w/following{/other_user}", "gists_url": "https://api.github.com/users/jonas-w/gists{/gist_id}", "starred_url": "https://api.github.com/users/jonas-w/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jonas-w/subscriptions", "organizations_url": "https://api.github.com/users/jonas-w/orgs", "repos_url": "https://api.github.com/users/jonas-w/repos", "events_url": "https://api.github.com/users/jonas-w/events{/privacy}", "received_events_url": "https://api.github.com/users/jonas-w/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2023-10-17T15:41:54
2023-10-17T21:54:36
2023-10-17T20:34:07
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When passing a prompt as an argument e.g. `ollama run $MODEL "Hello World!"`, ollama will exit after this prompt and don't wait for further input, like it is with `ollama run $MODEL`. Would it be possible to add an `-i` `--interactive` flag when passing the prompt directly?
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/820/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/820/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2488
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2488/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2488/comments
https://api.github.com/repos/ollama/ollama/issues/2488/events
https://github.com/ollama/ollama/issues/2488
2,133,781,415
I_kwDOJ0Z1Ps5_Luun
2,488
How can fine tune with ollama?
{ "login": "KDH-Korea", "id": 126445656, "node_id": "U_kgDOB4loWA", "avatar_url": "https://avatars.githubusercontent.com/u/126445656?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KDH-Korea", "html_url": "https://github.com/KDH-Korea", "followers_url": "https://api.github.com/users/KDH-Korea/followers", "following_url": "https://api.github.com/users/KDH-Korea/following{/other_user}", "gists_url": "https://api.github.com/users/KDH-Korea/gists{/gist_id}", "starred_url": "https://api.github.com/users/KDH-Korea/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KDH-Korea/subscriptions", "organizations_url": "https://api.github.com/users/KDH-Korea/orgs", "repos_url": "https://api.github.com/users/KDH-Korea/repos", "events_url": "https://api.github.com/users/KDH-Korea/events{/privacy}", "received_events_url": "https://api.github.com/users/KDH-Korea/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
3
2024-02-14T08:12:19
2024-02-20T22:52:07
2024-02-20T22:52:07
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I want to fine-tune the Mistral model imported using Ollama, but there is no information available, and it's even more challenging to find information in Korea where not many people are familiar with Ollama. I would appreciate it if you could provide information on how to fine-tune the model using Ollama.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2488/reactions", "total_count": 4, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 4 }
https://api.github.com/repos/ollama/ollama/issues/2488/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2002
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2002/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2002/comments
https://api.github.com/repos/ollama/ollama/issues/2002/events
https://github.com/ollama/ollama/issues/2002
2,082,071,134
I_kwDOJ0Z1Ps58GeJe
2,002
how to enable amd gpu for ollama ?
{ "login": "hemangjoshi37a", "id": 12392345, "node_id": "MDQ6VXNlcjEyMzkyMzQ1", "avatar_url": "https://avatars.githubusercontent.com/u/12392345?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hemangjoshi37a", "html_url": "https://github.com/hemangjoshi37a", "followers_url": "https://api.github.com/users/hemangjoshi37a/followers", "following_url": "https://api.github.com/users/hemangjoshi37a/following{/other_user}", "gists_url": "https://api.github.com/users/hemangjoshi37a/gists{/gist_id}", "starred_url": "https://api.github.com/users/hemangjoshi37a/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hemangjoshi37a/subscriptions", "organizations_url": "https://api.github.com/users/hemangjoshi37a/orgs", "repos_url": "https://api.github.com/users/hemangjoshi37a/repos", "events_url": "https://api.github.com/users/hemangjoshi37a/events{/privacy}", "received_events_url": "https://api.github.com/users/hemangjoshi37a/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-01-15T13:47:33
2024-01-15T14:04:51
2024-01-15T13:56:06
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
how to enable amd gpu for ollama ?
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2002/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2002/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7084
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7084/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7084/comments
https://api.github.com/repos/ollama/ollama/issues/7084/events
https://github.com/ollama/ollama/pull/7084
2,562,818,040
PR_kwDOJ0Z1Ps59b7DO
7,084
Adding: OrionChat: A Web Interface for Seamless AI Conversation
{ "login": "EliasPereirah", "id": 16616409, "node_id": "MDQ6VXNlcjE2NjE2NDA5", "avatar_url": "https://avatars.githubusercontent.com/u/16616409?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EliasPereirah", "html_url": "https://github.com/EliasPereirah", "followers_url": "https://api.github.com/users/EliasPereirah/followers", "following_url": "https://api.github.com/users/EliasPereirah/following{/other_user}", "gists_url": "https://api.github.com/users/EliasPereirah/gists{/gist_id}", "starred_url": "https://api.github.com/users/EliasPereirah/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EliasPereirah/subscriptions", "organizations_url": "https://api.github.com/users/EliasPereirah/orgs", "repos_url": "https://api.github.com/users/EliasPereirah/repos", "events_url": "https://api.github.com/users/EliasPereirah/events{/privacy}", "received_events_url": "https://api.github.com/users/EliasPereirah/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-10-02T23:32:12
2024-11-21T19:23:42
2024-11-21T19:23:42
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7084", "html_url": "https://github.com/ollama/ollama/pull/7084", "diff_url": "https://github.com/ollama/ollama/pull/7084.diff", "patch_url": "https://github.com/ollama/ollama/pull/7084.patch", "merged_at": "2024-11-21T19:23:42" }
OrionChat is a free web-based chat interface that simplifies interactions with multiple AI model providers. It provides a unified platform for chatting and exploring multiple large language models (LLMs), including: **Ollama**, OpenAI Google Gemini Claude (Anthropic) Groq Inc. Cerebras
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7084/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7084/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7050
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7050/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7050/comments
https://api.github.com/repos/ollama/ollama/issues/7050/events
https://github.com/ollama/ollama/pull/7050
2,557,223,671
PR_kwDOJ0Z1Ps59Jgfb
7,050
Stop model before deletion if loaded (fixed #6957)
{ "login": "alexmavr", "id": 680441, "node_id": "MDQ6VXNlcjY4MDQ0MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/680441?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alexmavr", "html_url": "https://github.com/alexmavr", "followers_url": "https://api.github.com/users/alexmavr/followers", "following_url": "https://api.github.com/users/alexmavr/following{/other_user}", "gists_url": "https://api.github.com/users/alexmavr/gists{/gist_id}", "starred_url": "https://api.github.com/users/alexmavr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alexmavr/subscriptions", "organizations_url": "https://api.github.com/users/alexmavr/orgs", "repos_url": "https://api.github.com/users/alexmavr/repos", "events_url": "https://api.github.com/users/alexmavr/events{/privacy}", "received_events_url": "https://api.github.com/users/alexmavr/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-09-30T17:09:35
2024-10-01T22:46:21
2024-10-01T22:45:43
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7050", "html_url": "https://github.com/ollama/ollama/pull/7050", "diff_url": "https://github.com/ollama/ollama/pull/7050.diff", "patch_url": "https://github.com/ollama/ollama/pull/7050.patch", "merged_at": "2024-10-01T22:45:43" }
This PR adds the same logic as `StopHandler` before model removal. If the model is not loaded, no error is raised. I also added a few tests on the server DeleteHandler, and fixed an error formatting bug where `ollama rm` would return a "file not found" error for the manifest file instead of a proper error message if the model does not exist Fixes #6957 Tested with: ``` $ ./ollama run localhost/phi/phi:latest $ ./ollama ps NAME ID SIZE PROCESSOR UNTIL localhost/phi/phi:latest e2fd6321a5fe 5.5 GB 100% GPU 4 minutes from now $ ./ollama rm localhost/phi/phi:latest deleted 'localhost/phi/phi:latest' $ ./ollama ps NAME ID SIZE PROCESSOR UNTIL ```
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7050/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7050/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4017
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4017/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4017/comments
https://api.github.com/repos/ollama/ollama/issues/4017/events
https://github.com/ollama/ollama/issues/4017
2,267,998,441
I_kwDOJ0Z1Ps6HLujp
4,017
Hardware configuration: 64-core CPU, 1TB memory, use of llama3:8b is slow. why?
{ "login": "zhaohuaxi-Shi", "id": 58802558, "node_id": "MDQ6VXNlcjU4ODAyNTU4", "avatar_url": "https://avatars.githubusercontent.com/u/58802558?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhaohuaxi-Shi", "html_url": "https://github.com/zhaohuaxi-Shi", "followers_url": "https://api.github.com/users/zhaohuaxi-Shi/followers", "following_url": "https://api.github.com/users/zhaohuaxi-Shi/following{/other_user}", "gists_url": "https://api.github.com/users/zhaohuaxi-Shi/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhaohuaxi-Shi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhaohuaxi-Shi/subscriptions", "organizations_url": "https://api.github.com/users/zhaohuaxi-Shi/orgs", "repos_url": "https://api.github.com/users/zhaohuaxi-Shi/repos", "events_url": "https://api.github.com/users/zhaohuaxi-Shi/events{/privacy}", "received_events_url": "https://api.github.com/users/zhaohuaxi-Shi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-04-29T02:20:58
2024-04-29T09:20:11
2024-04-29T09:20:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
user api/generate pre-load model , its useless
{ "login": "zhaohuaxi-Shi", "id": 58802558, "node_id": "MDQ6VXNlcjU4ODAyNTU4", "avatar_url": "https://avatars.githubusercontent.com/u/58802558?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhaohuaxi-Shi", "html_url": "https://github.com/zhaohuaxi-Shi", "followers_url": "https://api.github.com/users/zhaohuaxi-Shi/followers", "following_url": "https://api.github.com/users/zhaohuaxi-Shi/following{/other_user}", "gists_url": "https://api.github.com/users/zhaohuaxi-Shi/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhaohuaxi-Shi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhaohuaxi-Shi/subscriptions", "organizations_url": "https://api.github.com/users/zhaohuaxi-Shi/orgs", "repos_url": "https://api.github.com/users/zhaohuaxi-Shi/repos", "events_url": "https://api.github.com/users/zhaohuaxi-Shi/events{/privacy}", "received_events_url": "https://api.github.com/users/zhaohuaxi-Shi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4017/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4017/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5632
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5632/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5632/comments
https://api.github.com/repos/ollama/ollama/issues/5632/events
https://github.com/ollama/ollama/pull/5632
2,403,447,460
PR_kwDOJ0Z1Ps51HER6
5,632
cmd: better version info when client/server not equal
{ "login": "alwqx", "id": 9915368, "node_id": "MDQ6VXNlcjk5MTUzNjg=", "avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alwqx", "html_url": "https://github.com/alwqx", "followers_url": "https://api.github.com/users/alwqx/followers", "following_url": "https://api.github.com/users/alwqx/following{/other_user}", "gists_url": "https://api.github.com/users/alwqx/gists{/gist_id}", "starred_url": "https://api.github.com/users/alwqx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alwqx/subscriptions", "organizations_url": "https://api.github.com/users/alwqx/orgs", "repos_url": "https://api.github.com/users/alwqx/repos", "events_url": "https://api.github.com/users/alwqx/events{/privacy}", "received_events_url": "https://api.github.com/users/alwqx/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-07-11T15:12:44
2024-08-07T03:40:26
2024-08-07T03:40:26
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5632", "html_url": "https://github.com/ollama/ollama/pull/5632", "diff_url": "https://github.com/ollama/ollama/pull/5632.diff", "patch_url": "https://github.com/ollama/ollama/pull/5632.patch", "merged_at": null }
This pr updates version info when client version != server version ## example 1. **client version == server version**. local build with command `GOFLAGS="'-ldflags=-w -s \"-X=github.com/ollama/ollama/version.Version=v0.2.1-rc3\"'" go build .` ```shell $ ./ollama --version ollama version is v0.2.1-rc3 ``` 2. **client version != server version**. client is local built with specific version `v0.2.1-rc3`. This sometimes happens when upgrade ollama but not restart server or a client connects to remote ollama server. ```shell $ ./ollama --version ollama server version is 0.1.45 Warning: client version is v0.2.1-rc3 ``` 3. **client version != server version**. server built with default version. ```shell $ ./ollama --version ollama server version is 0.0.0 Warning: client version is v0.2.1-rc3 ``` @jmorganca @dhiltgen please help review or give feadback.
{ "login": "alwqx", "id": 9915368, "node_id": "MDQ6VXNlcjk5MTUzNjg=", "avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alwqx", "html_url": "https://github.com/alwqx", "followers_url": "https://api.github.com/users/alwqx/followers", "following_url": "https://api.github.com/users/alwqx/following{/other_user}", "gists_url": "https://api.github.com/users/alwqx/gists{/gist_id}", "starred_url": "https://api.github.com/users/alwqx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alwqx/subscriptions", "organizations_url": "https://api.github.com/users/alwqx/orgs", "repos_url": "https://api.github.com/users/alwqx/repos", "events_url": "https://api.github.com/users/alwqx/events{/privacy}", "received_events_url": "https://api.github.com/users/alwqx/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5632/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5632/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/29
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/29/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/29/comments
https://api.github.com/repos/ollama/ollama/issues/29/events
https://github.com/ollama/ollama/pull/29
1,783,023,896
PR_kwDOJ0Z1Ps5UW3Ts
29
Pull model name
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-06-30T18:58:00
2023-07-01T00:00:22
2023-06-30T18:58:58
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/29", "html_url": "https://github.com/ollama/ollama/pull/29", "diff_url": "https://github.com/ollama/ollama/pull/29.diff", "patch_url": "https://github.com/ollama/ollama/pull/29.patch", "merged_at": "2023-06-30T18:58:58" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/29/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/29/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/252
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/252/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/252/comments
https://api.github.com/repos/ollama/ollama/issues/252/events
https://github.com/ollama/ollama/pull/252
1,831,619,913
PR_kwDOJ0Z1Ps5W7YnX
252
Add model update to README.md
{ "login": "drhino", "id": 2538708, "node_id": "MDQ6VXNlcjI1Mzg3MDg=", "avatar_url": "https://avatars.githubusercontent.com/u/2538708?v=4", "gravatar_id": "", "url": "https://api.github.com/users/drhino", "html_url": "https://github.com/drhino", "followers_url": "https://api.github.com/users/drhino/followers", "following_url": "https://api.github.com/users/drhino/following{/other_user}", "gists_url": "https://api.github.com/users/drhino/gists{/gist_id}", "starred_url": "https://api.github.com/users/drhino/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/drhino/subscriptions", "organizations_url": "https://api.github.com/users/drhino/orgs", "repos_url": "https://api.github.com/users/drhino/repos", "events_url": "https://api.github.com/users/drhino/events{/privacy}", "received_events_url": "https://api.github.com/users/drhino/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-08-01T16:14:59
2023-08-01T19:06:34
2023-08-01T19:06:33
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/252", "html_url": "https://github.com/ollama/ollama/pull/252", "diff_url": "https://github.com/ollama/ollama/pull/252.diff", "patch_url": "https://github.com/ollama/ollama/pull/252.patch", "merged_at": "2023-08-01T19:06:33" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/252/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/252/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6032
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6032/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6032/comments
https://api.github.com/repos/ollama/ollama/issues/6032/events
https://github.com/ollama/ollama/pull/6032
2,434,130,980
PR_kwDOJ0Z1Ps52rVt1
6,032
Update to `llama3.1` elsewhere in repo
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-28T21:58:33
2024-07-29T02:56:04
2024-07-29T02:56:02
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6032", "html_url": "https://github.com/ollama/ollama/pull/6032", "diff_url": "https://github.com/ollama/ollama/pull/6032.diff", "patch_url": "https://github.com/ollama/ollama/pull/6032.patch", "merged_at": "2024-07-29T02:56:02" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6032/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6032/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1115
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1115/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1115/comments
https://api.github.com/repos/ollama/ollama/issues/1115/events
https://github.com/ollama/ollama/pull/1115
1,991,455,896
PR_kwDOJ0Z1Ps5fV7hd
1,115
Add ollama.nvim to list of terminal links
{ "login": "huynle", "id": 2416122, "node_id": "MDQ6VXNlcjI0MTYxMjI=", "avatar_url": "https://avatars.githubusercontent.com/u/2416122?v=4", "gravatar_id": "", "url": "https://api.github.com/users/huynle", "html_url": "https://github.com/huynle", "followers_url": "https://api.github.com/users/huynle/followers", "following_url": "https://api.github.com/users/huynle/following{/other_user}", "gists_url": "https://api.github.com/users/huynle/gists{/gist_id}", "starred_url": "https://api.github.com/users/huynle/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/huynle/subscriptions", "organizations_url": "https://api.github.com/users/huynle/orgs", "repos_url": "https://api.github.com/users/huynle/repos", "events_url": "https://api.github.com/users/huynle/events{/privacy}", "received_events_url": "https://api.github.com/users/huynle/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-11-13T20:46:12
2023-11-13T22:00:18
2023-11-13T22:00:18
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1115", "html_url": "https://github.com/ollama/ollama/pull/1115", "diff_url": "https://github.com/ollama/ollama/pull/1115.diff", "patch_url": "https://github.com/ollama/ollama/pull/1115.patch", "merged_at": "2023-11-13T22:00:18" }
`ollama.nvim` is a good plugin, uses the ollama API directly!
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1115/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1115/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3297
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3297/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3297/comments
https://api.github.com/repos/ollama/ollama/issues/3297/events
https://github.com/ollama/ollama/issues/3297
2,202,945,064
I_kwDOJ0Z1Ps6DTkYo
3,297
Do not allow upper-case letters in the model path
{ "login": "d3cker", "id": 2236710, "node_id": "MDQ6VXNlcjIyMzY3MTA=", "avatar_url": "https://avatars.githubusercontent.com/u/2236710?v=4", "gravatar_id": "", "url": "https://api.github.com/users/d3cker", "html_url": "https://github.com/d3cker", "followers_url": "https://api.github.com/users/d3cker/followers", "following_url": "https://api.github.com/users/d3cker/following{/other_user}", "gists_url": "https://api.github.com/users/d3cker/gists{/gist_id}", "starred_url": "https://api.github.com/users/d3cker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/d3cker/subscriptions", "organizations_url": "https://api.github.com/users/d3cker/orgs", "repos_url": "https://api.github.com/users/d3cker/repos", "events_url": "https://api.github.com/users/d3cker/events{/privacy}", "received_events_url": "https://api.github.com/users/d3cker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
3
2024-03-22T17:09:26
2024-04-21T00:58:59
2024-04-21T00:58:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Hello community. I'm using locally hosted Docker repository on Synology NAS. Unfortunately I have an issue with one particular model: *wizardcoder-python-13b-v1.0.Q6_K.gguf* ``` $ sha256sum wizardcoder-python-13b-v1.0.Q6_K.gguf a20f795d17d64e487b6b3446227ba2931bbcb3bc7bb7ebd652b9663efb1f090b wizardcoder-python-13b-v1.0.Q6_K.gguf ``` Stored in ollama as: ``` $ ls -l ~/.ollama/models/blobs/sha256-a20f795d17d64e487b6b3446227ba2931bbcb3bc7bb7ebd652b9663efb1f090b -rw------- 1 d3cker users 10679148768 mar 22 00:07 .ollama/models/blobs/sha256-a20f795d17d64e487b6b3446227ba2931bbcb3bc7bb7ebd652b9663efb1f090b ``` Model is accessible so I can run it: ``` $ ollama run synology:8500/wizardcoder/wizardcoder-python13B:latest >>> Hi. Hello, how can I assist you today? >>> Send a message (/? for help) ``` When I try to push this model to the repository I receive this error: Client: ``` $ ollama push synology:8500/wizardcoder/wizardcoder-python13B:latest --insecure retrieving manifest Error: file does not exist ``` Server (with some extra info printed in upload.go and images.go): ``` [GIN] 2024/03/22 - 17:15:36 | 200 | 29.936µs | 127.0.0.1 | HEAD "/" time=2024-03-22T17:15:36.634+01:00 level=INFO source=images.go:960 msg="error uploading blob: &{application/vnd.ollama.image.model sha256:a20f795d17d64e487b6b3446227ba2931bbcb3bc7bb7ebd652b9663efb1f090b 10679148768 }" time=2024-03-22T17:15:36.634+01:00 level=INFO source=images.go:961 msg="error uploading blob: {https synology:8500 wizardcoder wizardcoder-python13B latest}" upload.go: mp.BaseURL: https://synology:8500 upload.go: requestURL: https://synology:8500/v2/wizardcoder/wizardcoder-python13B/blobs/sha256:a20f795d17d64e487b6b3446227ba2931bbcb3bc7bb7ebd652b9663efb1f090b time=2024-03-22T17:15:36.652+01:00 level=INFO source=images.go:1141 msg="RESP: &{404 Not Found 404 HTTP/1.1 1 1 map[Content-Length:[19] Content-Type:[text/plain; charset=utf-8] Date:[Fri, 22 Mar 2024 16:15:36 GMT] Docker-Distribution-Api-Version:[registry/2.0] X-Content-Type-Options:[nosniff]] {} 19 [] false false map[] 0xc000434480 <nil>}" upload.go: resp: <nil> upload.go: err: file does not exist time=2024-03-22T17:15:36.654+01:00 level=INFO source=images.go:1141 msg="RESP: &{404 Not Found 404 HTTP/1.1 1 1 map[Content-Length:[19] Content-Type:[text/plain; charset=utf-8] Date:[Fri, 22 Mar 2024 16:15:36 GMT] Docker-Distribution-Api-Version:[registry/2.0] X-Content-Type-Options:[nosniff]] 0xc000516540 19 [] false false map[] 0xc00017dc20 <nil>}" time=2024-03-22T17:15:36.654+01:00 level=INFO source=images.go:963 msg="error uploading blob: file does not exist" [GIN] 2024/03/22 - 17:15:36 | 200 | 20.234864ms | 127.0.0.1 | POST "/api/push" ``` ### What did you expect to see? I don't have that issue with other models. For example: Model: https://huggingface.co/TheBloke/open-llama-3b-v2-wizard-evol-instuct-v2-196k-GGUF/blob/main/open-llama-3b-v2-wizard-evol-instuct-v2-196k.Q8_0.gguf ``` $ sha256sum open-llama-3b-v2-wizard-evol-instuct-v2-196k.Q8_0.gguf 07bf92495e6310bc343e116089c5c656caf2e8ee479d30eec2804d5a685cc496 open-llama-3b-v2-wizard-evol-instuct-v2-196k.Q8_0.gguf ``` Client: ``` $ ollama push synology:8500/openllama/openllama-3b-v2:latest --insecure retrieving manifest pushing 07bf92495e63... 100% ▕████████████████████████████████████████████████████████████████████████████▏ 3.6 GB pushing d19d7b5ba224... 100% ▕████████████████████████████████████████████████████████████████████████████▏ 56 B pushing 6463b496ea44... 100% ▕████████████████████████████████████████████████████████████████████████████▏ 146 B pushing c5d3c17f2d37... 100% ▕████████████████████████████████████████████████████████████████████████████▏ 40 B pushing fb890504d5eb... 100% ▕████████████████████████████████████████████████████████████████████████████▏ 483 B pushing manifest success ``` Server: ``` [GIN] 2024/03/22 - 17:12:58 | 200 | 42.459µs | 127.0.0.1 | HEAD "/" time=2024-03-22T17:12:58.425+01:00 level=INFO source=images.go:960 msg="error uploading blob: &{application/vnd.ollama.image.model sha256:07bf92495e6310bc343e116089c5c656caf2e8ee479d30eec2804d5a685cc496 3641876224 }" time=2024-03-22T17:12:58.425+01:00 level=INFO source=images.go:961 msg="error uploading blob: {https synology:8500 openllama openllama-3b-v2 latest}" upload.go: mp.BaseURL: https://synology:8500 upload.go: requestURL: https://synology:8500/v2/openllama/openllama-3b-v2/blobs/sha256:07bf92495e6310bc343e116089c5c656caf2e8ee479d30eec2804d5a685cc496 time=2024-03-22T17:12:58.432+01:00 level=INFO source=images.go:1141 msg="RESP: &{404 Not Found 404 HTTP/1.1 1 1 map[Content-Length:[157] Content-Type:[application/json; charset=utf-8] Date:[Fri, 22 Mar 2024 16:12:58 GMT] Docker-Distribution-Api-Version:[registry/2.0] X-Content-Type-Options:[nosniff]] {} 157 [] false false map[] 0xc00058e480 <nil>}" upload.go: resp: <nil> upload.go: err: file does not exist time=2024-03-22T17:12:58.618+01:00 level=INFO source=images.go:1141 msg="RESP: &{202 Accepted 202 HTTP/1.1 1 1 map[Content-Length:[0] Date:[Fri, 22 Mar 2024 16:12:58 GMT] Docker-Distribution-Api-Version:[registry/2.0] Docker-Upload-Uuid:[da6074c8-9cc1-4e7f-afda-073b7ad82bdb] Location:[http://synology:8500/v2/openllama/openllama-3b-v2/blobs/uploads/da6074c8-9cc1-4e7f-afda-073b7ad82bdb?_state=UV5vI2Bjtem4F0vEIVnIapqAC8hrPQZu4w4GbzhXEY57Ik5hbWUiOiJvcGVubGxhbWEvb3BlbmxsYW1hLTNiLXYyIiwiVVVJRCI6ImRhNjA3NGM4LTljYzEtNGU3Zi1hZmRhLTA3M2I3YWQ4MmJkYiIsIk9mZnNldCI6MCwiU3RhcnRlZEF0IjoiMjAyNC0wMy0yMlQxNjoxMjo1OC4zMzgxNzI1ODJaIn0%3D] Range:[0-0] X-Content-Type-Options:[nosniff]] {} 0 [] false false map[] 0xc00017c5a0 <nil>}" time=2024-03-22T17:12:58.618+01:00 level=INFO source=upload.go:109 msg="uploading 07bf92495e63 in 37 100 MB part(s)" ``` ### Steps to reproduce - Deploy registry in a container ``` $ docker run -d -p 5000:5000 --restart always --name registry registry:2 ``` - Download model from https://huggingface.co/TheBloke/WizardCoder-Python-13B-V1.0-GGUF/blob/main/wizardcoder-python-13b-v1.0.Q6_K.gguf - Create Modelfile: ``` FROM ./wizardcoder-python-13b-v1.0.Q6_K.gguf TEMPLATE """{{ .System }} ### Instruction: {{ .Prompt }} ### Response: """ SYSTEM """ Below is an instruction that describes a task. Write a response that appropriately completes the request. """ PARAMETER stop "### Instruction:" PARAMETER stop "### Response:" ``` - Create model on ollama server: ``` $ ollama create localhost:5000/wizardcoder/wizardcoder-python13B:latest --insecure-f Modelfile ``` - Push model to local registry: ``` $ ollama push localhost:5000/wizardcoder/wizardcoder-python13B:latest --insecure ``` ### Are there any recent changes that introduced the issue? I'm not sure if there was any change that introduced the issue. I tried several previous commits and I always end up with the same error. ### OS Linux ### Architecture x86_64 ### Platform _No response_ ### Ollama version 0.0.0 (commit a5ba0fcf788a3d7543e2011593670ddc5012f125) ### GPU AMD ### GPU info ``` ROCk module is loaded ===================== HSA System Attributes ===================== Runtime Version: 1.1 System Timestamp Freq.: 1000.000000MHz Sig. Max Wait Duration: 18446744073709551615 (0xFFFFFFFFFFFFFFFF) (timestamp count) Machine Model: LARGE System Endianness: LITTLE Mwaitx: DISABLED DMAbuf Support: YES ========== HSA Agents ========== ******* Agent 1 ******* Name: Intel(R) Core(TM) i9-10940X CPU @ 3.30GHz Uuid: CPU-XX Marketing Name: Intel(R) Core(TM) i9-10940X CPU @ 3.30GHz Vendor Name: CPU Feature: None specified Profile: FULL_PROFILE Float Round Mode: NEAR Max Queue Number: 0(0x0) Queue Min Size: 0(0x0) Queue Max Size: 0(0x0) Queue Type: MULTI Node: 0 Device Type: CPU Cache Info: L1: 32768(0x8000) KB Chip ID: 0(0x0) ASIC Revision: 0(0x0) Cacheline Size: 64(0x40) Max Clock Freq. (MHz): 4300 BDFID: 0 Internal Node ID: 0 Compute Unit: 28 SIMDs per CU: 0 Shader Engines: 0 Shader Arrs. per Eng.: 0 WatchPts on Addr. Ranges:1 Features: None Pool Info: Pool 1 Segment: GLOBAL; FLAGS: FINE GRAINED Size: 65506932(0x3e78e74) KB Allocatable: TRUE Alloc Granule: 4KB Alloc Alignment: 4KB Accessible by all: TRUE Pool 2 Segment: GLOBAL; FLAGS: KERNARG, FINE GRAINED Size: 65506932(0x3e78e74) KB Allocatable: TRUE Alloc Granule: 4KB Alloc Alignment: 4KB Accessible by all: TRUE Pool 3 Segment: GLOBAL; FLAGS: COARSE GRAINED Size: 65506932(0x3e78e74) KB Allocatable: TRUE Alloc Granule: 4KB Alloc Alignment: 4KB Accessible by all: TRUE ISA Info: ******* Agent 2 ******* Name: gfx1100 Uuid: GPU-af532b568c7bef7a Marketing Name: Radeon RX 7900 XTX Vendor Name: AMD Feature: KERNEL_DISPATCH Profile: BASE_PROFILE Float Round Mode: NEAR Max Queue Number: 128(0x80) Queue Min Size: 64(0x40) Queue Max Size: 131072(0x20000) Queue Type: MULTI Node: 1 Device Type: GPU Cache Info: L1: 32(0x20) KB L2: 6144(0x1800) KB L3: 98304(0x18000) KB Chip ID: 29772(0x744c) ASIC Revision: 0(0x0) Cacheline Size: 64(0x40) Max Clock Freq. (MHz): 2371 BDFID: 26368 Internal Node ID: 1 Compute Unit: 96 SIMDs per CU: 2 Shader Engines: 6 Shader Arrs. per Eng.: 2 WatchPts on Addr. Ranges:4 Coherent Host Access: FALSE Features: KERNEL_DISPATCH Fast F16 Operation: TRUE Wavefront Size: 32(0x20) Workgroup Max Size: 1024(0x400) Workgroup Max Size per Dimension: x 1024(0x400) y 1024(0x400) z 1024(0x400) Max Waves Per CU: 32(0x20) Max Work-item Per CU: 1024(0x400) Grid Max Size: 4294967295(0xffffffff) Grid Max Size per Dimension: x 4294967295(0xffffffff) y 4294967295(0xffffffff) z 4294967295(0xffffffff) Max fbarriers/Workgrp: 32 Packet Processor uCode:: 550 SDMA engine uCode:: 19 IOMMU Support:: None Pool Info: Pool 1 Segment: GLOBAL; FLAGS: COARSE GRAINED Size: 25149440(0x17fc000) KB Allocatable: TRUE Alloc Granule: 4KB Alloc Alignment: 4KB Accessible by all: FALSE Pool 2 Segment: GLOBAL; FLAGS: EXTENDED FINE GRAINED Size: 25149440(0x17fc000) KB Allocatable: TRUE Alloc Granule: 4KB Alloc Alignment: 4KB Accessible by all: FALSE Pool 3 Segment: GROUP Size: 64(0x40) KB Allocatable: FALSE Alloc Granule: 0KB Alloc Alignment: 0KB Accessible by all: FALSE ISA Info: ISA 1 Name: amdgcn-amd-amdhsa--gfx1100 Machine Models: HSA_MACHINE_MODEL_LARGE Profiles: HSA_PROFILE_BASE Default Rounding Mode: NEAR Default Rounding Mode: NEAR Fast f16: TRUE Workgroup Max Size: 1024(0x400) Workgroup Max Size per Dimension: x 1024(0x400) y 1024(0x400) z 1024(0x400) Grid Max Size: 4294967295(0xffffffff) Grid Max Size per Dimension: x 4294967295(0xffffffff) y 4294967295(0xffffffff) z 4294967295(0xffffffff) FBarrier Max Size: 32 *** Done *** ``` ### CPU Intel ### Other software Docker registry from Docker Hub from https://hub.docker.com/_/registry (latest) Running on ARM based Synology NAS. I also reproduced this issue with registry (b4fd5ad419a22e955f5c05bcdfdfca91125edc3ed09c60b9979f80be954c97d0) running locally on my desktop.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3297/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3297/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1303
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1303/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1303/comments
https://api.github.com/repos/ollama/ollama/issues/1303/events
https://github.com/ollama/ollama/issues/1303
2,014,658,584
I_kwDOJ0Z1Ps54FUAY
1,303
Memory required to run differs from expectation
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-11-28T15:06:37
2024-01-08T21:42:03
2024-01-08T21:42:03
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
After discussing internally, it was suggested that as long as we have enough total memory across ram and vram, the model should load. Layers are loaded into main memory then offloaded into vram. So I tried with different memory sizes and number of attached T4 cards with 16-ish GB vram each. When there is 16 GB RAM and 4x T4, adding up to 76GB, I get a timeout: `Error: timed out waiting for llama runner to start`. I get the same error with 30GB RAM, and 60GB RAM. Its not until i go to the next threshold (100 GB) with the 4x T4 that it loads correctly. We need to clarify how much memory is required to run models. This is easy where we started on Apple Silicon because there is one number. But more complicated on Nvidia.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1303/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1303/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5344
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5344/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5344/comments
https://api.github.com/repos/ollama/ollama/issues/5344/events
https://github.com/ollama/ollama/issues/5344
2,379,148,503
I_kwDOJ0Z1Ps6NzuzX
5,344
"Mock" model
{ "login": "mweel1", "id": 77025147, "node_id": "MDQ6VXNlcjc3MDI1MTQ3", "avatar_url": "https://avatars.githubusercontent.com/u/77025147?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mweel1", "html_url": "https://github.com/mweel1", "followers_url": "https://api.github.com/users/mweel1/followers", "following_url": "https://api.github.com/users/mweel1/following{/other_user}", "gists_url": "https://api.github.com/users/mweel1/gists{/gist_id}", "starred_url": "https://api.github.com/users/mweel1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mweel1/subscriptions", "organizations_url": "https://api.github.com/users/mweel1/orgs", "repos_url": "https://api.github.com/users/mweel1/repos", "events_url": "https://api.github.com/users/mweel1/events{/privacy}", "received_events_url": "https://api.github.com/users/mweel1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
2
2024-06-27T21:55:10
2024-07-08T23:10:49
2024-07-08T23:10:49
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
We are developing some application on ollama and the performance would be ok for a user, however when we are developing software the lag time to generate can be very slow. Would it be possible to build a "development" model that didn't have a lot of parameters but at least let you build a product at a reasonable pace?
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5344/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5344/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4315
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4315/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4315/comments
https://api.github.com/repos/ollama/ollama/issues/4315/events
https://github.com/ollama/ollama/issues/4315
2,290,018,062
I_kwDOJ0Z1Ps6IfucO
4,315
Llama3 model continually prompts itself in an infinite loop.
{ "login": "billwestrup", "id": 168590261, "node_id": "U_kgDOCgx7tQ", "avatar_url": "https://avatars.githubusercontent.com/u/168590261?v=4", "gravatar_id": "", "url": "https://api.github.com/users/billwestrup", "html_url": "https://github.com/billwestrup", "followers_url": "https://api.github.com/users/billwestrup/followers", "following_url": "https://api.github.com/users/billwestrup/following{/other_user}", "gists_url": "https://api.github.com/users/billwestrup/gists{/gist_id}", "starred_url": "https://api.github.com/users/billwestrup/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/billwestrup/subscriptions", "organizations_url": "https://api.github.com/users/billwestrup/orgs", "repos_url": "https://api.github.com/users/billwestrup/repos", "events_url": "https://api.github.com/users/billwestrup/events{/privacy}", "received_events_url": "https://api.github.com/users/billwestrup/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-05-10T16:14:53
2024-05-10T18:49:13
2024-05-10T16:27:21
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When I run llama3 and then prompt the model with "hello" I get the following output, which loops continuously unless I stop it with ctrl-c: (see below) llama run llama3 >>> hello Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat?assistant I'm happy to chat! How about we play a game, have a fun conversation, or learn something new together? What sounds interesting to you?assistant That sounds like a great idea! Let's see... I'm feeling adventurous today. How about we generate a random scenario and then improvise a story together? I'll start: "You find yourself standing in front of an old, mysterious library with a sign that reads 'Forbidden Knowledge'." Your turn!assistant Ooh, sounds intriguing! As you approach the entrance, you notice that the door is slightly ajar. You push it open and step inside, calling out to see if anyone responds. The air inside is thick with dust and the scent of old books. The room stretches out before you, lined with shelves upon shelves of dusty tomes. A large wooden desk sits in the center of the room, covered in strange artifacts and a single, leather-bound book that seems to be pulsing with an otherworldly energy. Suddenly, a low, gravelly voice speaks^C ### OS Linux ### GPU Nvidia ### CPU AMD ### Ollama version 0.1.20
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4315/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4315/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4162
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4162/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4162/comments
https://api.github.com/repos/ollama/ollama/issues/4162/events
https://github.com/ollama/ollama/pull/4162
2,279,323,788
PR_kwDOJ0Z1Ps5ujrJU
4,162
Allocate a large enough kv cache for all parallel requests
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-05-05T04:43:06
2024-05-05T22:59:33
2024-05-05T22:59:32
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4162", "html_url": "https://github.com/ollama/ollama/pull/4162", "diff_url": "https://github.com/ollama/ollama/pull/4162.diff", "patch_url": "https://github.com/ollama/ollama/pull/4162.patch", "merged_at": "2024-05-05T22:59:32" }
This fixes `opts.NumCtx` being assigned correctly
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4162/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4162/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5959
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5959/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5959/comments
https://api.github.com/repos/ollama/ollama/issues/5959/events
https://github.com/ollama/ollama/issues/5959
2,430,606,819
I_kwDOJ0Z1Ps6Q4B3j
5,959
Ollama is running but can't acces it from OpenWebUI
{ "login": "ns-bcr", "id": 134287870, "node_id": "U_kgDOCAER_g", "avatar_url": "https://avatars.githubusercontent.com/u/134287870?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ns-bcr", "html_url": "https://github.com/ns-bcr", "followers_url": "https://api.github.com/users/ns-bcr/followers", "following_url": "https://api.github.com/users/ns-bcr/following{/other_user}", "gists_url": "https://api.github.com/users/ns-bcr/gists{/gist_id}", "starred_url": "https://api.github.com/users/ns-bcr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ns-bcr/subscriptions", "organizations_url": "https://api.github.com/users/ns-bcr/orgs", "repos_url": "https://api.github.com/users/ns-bcr/repos", "events_url": "https://api.github.com/users/ns-bcr/events{/privacy}", "received_events_url": "https://api.github.com/users/ns-bcr/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
18
2024-07-25T17:19:22
2024-10-22T09:52:24
2024-07-26T10:21:12
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Hey, So, I am having a problem, I have Ollama runnig on ubuntu-server 24.04 LTS. It works properly locally but from my computer I can't acces it, like from it, I can run : `ollama run llama2` and it works. I can also try to make this command from the server : `curl localhost:11434` and it will say "Ollama is running" But from my computer when I go to 192.168.1.145:11434 I get nothing. And if i try to run `curl 192.168.1.145:11434` I get "curl: (7) Failed to connect to 192.168.1.145 port 80 after 0 ms: Couldn't connect to server" I tried multiple thing : - I tried opening the port on TCP - I tried looking in the LOGS but nothing wrong - I also tried reinstalling it from zero. I can provide more information if necessary. Thx. ### OS Linux ### GPU Intel ### CPU Intel ### Ollama version 0.2.8
{ "login": "ns-bcr", "id": 134287870, "node_id": "U_kgDOCAER_g", "avatar_url": "https://avatars.githubusercontent.com/u/134287870?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ns-bcr", "html_url": "https://github.com/ns-bcr", "followers_url": "https://api.github.com/users/ns-bcr/followers", "following_url": "https://api.github.com/users/ns-bcr/following{/other_user}", "gists_url": "https://api.github.com/users/ns-bcr/gists{/gist_id}", "starred_url": "https://api.github.com/users/ns-bcr/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ns-bcr/subscriptions", "organizations_url": "https://api.github.com/users/ns-bcr/orgs", "repos_url": "https://api.github.com/users/ns-bcr/repos", "events_url": "https://api.github.com/users/ns-bcr/events{/privacy}", "received_events_url": "https://api.github.com/users/ns-bcr/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5959/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/5959/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8581
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8581/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8581/comments
https://api.github.com/repos/ollama/ollama/issues/8581/events
https://github.com/ollama/ollama/issues/8581
2,811,029,937
I_kwDOJ0Z1Ps6njOmx
8,581
Model Location
{ "login": "JohnnyLeuthard", "id": 14182453, "node_id": "MDQ6VXNlcjE0MTgyNDUz", "avatar_url": "https://avatars.githubusercontent.com/u/14182453?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JohnnyLeuthard", "html_url": "https://github.com/JohnnyLeuthard", "followers_url": "https://api.github.com/users/JohnnyLeuthard/followers", "following_url": "https://api.github.com/users/JohnnyLeuthard/following{/other_user}", "gists_url": "https://api.github.com/users/JohnnyLeuthard/gists{/gist_id}", "starred_url": "https://api.github.com/users/JohnnyLeuthard/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/JohnnyLeuthard/subscriptions", "organizations_url": "https://api.github.com/users/JohnnyLeuthard/orgs", "repos_url": "https://api.github.com/users/JohnnyLeuthard/repos", "events_url": "https://api.github.com/users/JohnnyLeuthard/events{/privacy}", "received_events_url": "https://api.github.com/users/JohnnyLeuthard/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
7
2025-01-25T15:08:39
2025-01-28T21:33:49
2025-01-28T21:33:48
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Why is it so difficult and unreliable to move the models folder,. I have a Mac mini and it worked for about a day. Nothing has changed and I have even gone through and reinstalled and set it all up again, rebooted, verified permissions. heck I even went as far as grad everyone read/write just to eliminate that. Yet STILL it writes to the default location rather than where my environment variable is defining it. Why is it not just a config file where you can put the path in? Why can this just be a setting rather than have to rely on hacking things up by using something like a symbolic link? That is a last resort kind of configuration for when all else fails. These are the kinds of things that make me question the overall product and trusting it's reliability and that it won't start dumping all kinds of files god knows where consuming and wasting space. Why is this so difficult?!
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8581/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8581/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5398
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5398/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5398/comments
https://api.github.com/repos/ollama/ollama/issues/5398/events
https://github.com/ollama/ollama/issues/5398
2,382,938,437
I_kwDOJ0Z1Ps6OCMFF
5,398
OLLAMA_NUM_PARALLEL and OLLAMA_MAX_LOADED_MODELS not having an effect on Ubuntu 22.04 LTS
{ "login": "mrmiket64", "id": 99057519, "node_id": "U_kgDOBed_bw", "avatar_url": "https://avatars.githubusercontent.com/u/99057519?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mrmiket64", "html_url": "https://github.com/mrmiket64", "followers_url": "https://api.github.com/users/mrmiket64/followers", "following_url": "https://api.github.com/users/mrmiket64/following{/other_user}", "gists_url": "https://api.github.com/users/mrmiket64/gists{/gist_id}", "starred_url": "https://api.github.com/users/mrmiket64/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mrmiket64/subscriptions", "organizations_url": "https://api.github.com/users/mrmiket64/orgs", "repos_url": "https://api.github.com/users/mrmiket64/repos", "events_url": "https://api.github.com/users/mrmiket64/events{/privacy}", "received_events_url": "https://api.github.com/users/mrmiket64/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-07-01T06:17:07
2024-07-02T20:30:57
2024-07-02T20:30:00
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Short description: I have set "OLLAMA_NUM_PARALLEL=4" and "OLLAMA_MAX_LOADED_MODELS=2" but I cannot load two models at a time on Ollama 0.1.48 Note 1: The variables were having an effect and working as expected in an older Ollama version, I think it was v0.1.34. Note 2: I asked a friend to please do the same test and for him worked fine on a Mac M1, Ollama 0.1.48. Operative System: Ubuntu 22.04.4 LTS Hardware: - Processor: i7 7700 - RAM: 64Gb - GPU1: Nvidia 1070 - GPU2: Nvidia 1070 Testing: Trying to load the models "mistral:7b-instruct-q8_0" and "llama3:8b-instruct-q8_0" at the same time, calling them with ollama run from two ssh remote connections, but only loaded one at a time. Confirmed with "ollama ps" and the inference ran first with one model, then with the other sequentially. <img width="925" alt="Screenshot 2024-07-01 at 12 08 22 a m" src="https://github.com/ollama/ollama/assets/99057519/f1664ea1-2897-4c80-8091-8caac0bf5e06"> Here is a screenshot of the variables setup as shown with the command "sudo systemctl edit ollama" <img width="828" alt="Screenshot 2024-06-30 at 11 51 38 p m" src="https://github.com/ollama/ollama/assets/99057519/c9f27da4-b3ad-4189-875c-5f71cd3f12a9"> Attached is the status of the service and as you can see, the variables are considered. [ollama_status_2.txt](https://github.com/user-attachments/files/16048447/ollama_status_2.txt) Also attached are the logs captures during the test. [troubleshooting_logs.txt](https://github.com/user-attachments/files/16048587/troubleshooting_logs.txt) Please let me know if more detail is needed. Thank you Mike ### OS Linux ### GPU Nvidia ### CPU Intel ### Ollama version 0.1.48
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5398/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5398/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/178
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/178/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/178/comments
https://api.github.com/repos/ollama/ollama/issues/178/events
https://github.com/ollama/ollama/pull/178
1,816,813,133
PR_kwDOJ0Z1Ps5WJmLs
178
use gin-contrib/cors middleware
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-07-22T16:04:28
2023-07-22T16:40:15
2023-07-22T16:40:01
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/178", "html_url": "https://github.com/ollama/ollama/pull/178", "diff_url": "https://github.com/ollama/ollama/pull/178.diff", "patch_url": "https://github.com/ollama/ollama/pull/178.patch", "merged_at": "2023-07-22T16:40:01" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/178/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/178/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8474
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8474/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8474/comments
https://api.github.com/repos/ollama/ollama/issues/8474/events
https://github.com/ollama/ollama/issues/8474
2,796,572,419
I_kwDOJ0Z1Ps6msE8D
8,474
Model running on 100% GPU runs on CPU
{ "login": "RGFTheCoder", "id": 24970643, "node_id": "MDQ6VXNlcjI0OTcwNjQz", "avatar_url": "https://avatars.githubusercontent.com/u/24970643?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RGFTheCoder", "html_url": "https://github.com/RGFTheCoder", "followers_url": "https://api.github.com/users/RGFTheCoder/followers", "following_url": "https://api.github.com/users/RGFTheCoder/following{/other_user}", "gists_url": "https://api.github.com/users/RGFTheCoder/gists{/gist_id}", "starred_url": "https://api.github.com/users/RGFTheCoder/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RGFTheCoder/subscriptions", "organizations_url": "https://api.github.com/users/RGFTheCoder/orgs", "repos_url": "https://api.github.com/users/RGFTheCoder/repos", "events_url": "https://api.github.com/users/RGFTheCoder/events{/privacy}", "received_events_url": "https://api.github.com/users/RGFTheCoder/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2025-01-18T01:41:16
2025-01-19T15:00:25
2025-01-18T04:03:25
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I've been using `hf.co/QuantFactory/Qwen2.5-14B-Instruct-GGUF:Q6_K` for a while and recently noticed a slowdown on a recent update to 0.5.4. This isn't fixed on 0.5.7. Ollama reports that the model is running on gpu 100%, but my usage shows that my cpu runs at 50% util, and my gpu barely gets 5% usage. (69.69.69.69 is a loopback device for cloudflared private network) ![Image](https://github.com/user-attachments/assets/c133a56b-15de-4c7f-b0ee-0842300c294b) ### OS Linux ### GPU Nvidia ### CPU AMD ### Ollama version 0.5.4
{ "login": "RGFTheCoder", "id": 24970643, "node_id": "MDQ6VXNlcjI0OTcwNjQz", "avatar_url": "https://avatars.githubusercontent.com/u/24970643?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RGFTheCoder", "html_url": "https://github.com/RGFTheCoder", "followers_url": "https://api.github.com/users/RGFTheCoder/followers", "following_url": "https://api.github.com/users/RGFTheCoder/following{/other_user}", "gists_url": "https://api.github.com/users/RGFTheCoder/gists{/gist_id}", "starred_url": "https://api.github.com/users/RGFTheCoder/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RGFTheCoder/subscriptions", "organizations_url": "https://api.github.com/users/RGFTheCoder/orgs", "repos_url": "https://api.github.com/users/RGFTheCoder/repos", "events_url": "https://api.github.com/users/RGFTheCoder/events{/privacy}", "received_events_url": "https://api.github.com/users/RGFTheCoder/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8474/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8474/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5167
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5167/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5167/comments
https://api.github.com/repos/ollama/ollama/issues/5167/events
https://github.com/ollama/ollama/issues/5167
2,364,109,368
I_kwDOJ0Z1Ps6M6XI4
5,167
Unable to set "encoding_format" and "dimensions" parameters for the "mxbai-embed-large"
{ "login": "netandreus", "id": 313477, "node_id": "MDQ6VXNlcjMxMzQ3Nw==", "avatar_url": "https://avatars.githubusercontent.com/u/313477?v=4", "gravatar_id": "", "url": "https://api.github.com/users/netandreus", "html_url": "https://github.com/netandreus", "followers_url": "https://api.github.com/users/netandreus/followers", "following_url": "https://api.github.com/users/netandreus/following{/other_user}", "gists_url": "https://api.github.com/users/netandreus/gists{/gist_id}", "starred_url": "https://api.github.com/users/netandreus/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/netandreus/subscriptions", "organizations_url": "https://api.github.com/users/netandreus/orgs", "repos_url": "https://api.github.com/users/netandreus/repos", "events_url": "https://api.github.com/users/netandreus/events{/privacy}", "received_events_url": "https://api.github.com/users/netandreus/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
0
2024-06-20T10:36:31
2024-06-20T10:36:46
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? This is great that Ollama has an [mxbai-embed-large](https://ollama.com/library/mxbai-embed-large:latest/blobs/b837481ff855) embedding model. I am trying to use this model with "ubinary" encoding_format and 512 dimensions like this (according to [this blog post](https://www.mixedbread.ai/blog/binary-mrl)): ``` import { MixedbreadAIClient } from "@mixedbread-ai/sdk"; const mxbai = new MixedbreadAIClient({ apiKey: "{MIXEDBREAD_API_KEY}" }); const res = await mxbai.embeddings({ model: 'mixedbread-ai/mxbai-embed-large-v1', input: [ 'Who is german and likes bread?', 'Everybody in Germany.' ], normalized: true, // this has to be True if you want to use binary with faiss encoding_format: 'ubinary', dimensions=512 }) ``` but with local Ollama server. I am confused, that [there are no these parameters](https://ollama.com/library/mxbai-embed-large:latest/blobs/b837481ff855) in model: ``` { "num_ctx": 512 } ``` Can you please add them? It will be very usefull for Matryoshka Representation Learning. ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.1.43
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5167/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5167/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/3273
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3273/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3273/comments
https://api.github.com/repos/ollama/ollama/issues/3273/events
https://github.com/ollama/ollama/pull/3273
2,198,148,232
PR_kwDOJ0Z1Ps5qQUH6
3,273
Add unicode support for windows model paths
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-03-20T17:40:46
2024-04-16T21:00:13
2024-04-16T21:00:13
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
true
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3273", "html_url": "https://github.com/ollama/ollama/pull/3273", "diff_url": "https://github.com/ollama/ollama/pull/3273.diff", "patch_url": "https://github.com/ollama/ollama/pull/3273.patch", "merged_at": null }
This should fix various model load and library load errors reported by non-english users. I've verified happy-path on en-us but need to set up a repro for non-english and/or unicode characters before we should merge this. Once confirmed, I'll tie this to those issues to close them on merge.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3273/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3273/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5122
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5122/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5122/comments
https://api.github.com/repos/ollama/ollama/issues/5122/events
https://github.com/ollama/ollama/pull/5122
2,360,657,997
PR_kwDOJ0Z1Ps5y3vXk
5,122
types/model: remove Digest
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-06-18T20:31:51
2024-06-19T03:28:12
2024-06-19T03:28:11
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5122", "html_url": "https://github.com/ollama/ollama/pull/5122", "diff_url": "https://github.com/ollama/ollama/pull/5122.diff", "patch_url": "https://github.com/ollama/ollama/pull/5122.patch", "merged_at": "2024-06-19T03:28:11" }
The Digest type in its current form is awkward to work with and presents challenges with regard to how it serializes via String using the '-' prefix. We currently only use this in ollama.com, so we'll move our specific needs around digest parsing and validation there.
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5122/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5122/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8609
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8609/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8609/comments
https://api.github.com/repos/ollama/ollama/issues/8609/events
https://github.com/ollama/ollama/issues/8609
2,813,352,158
I_kwDOJ0Z1Ps6nsFje
8,609
Mistyrious things with local installation.
{ "login": "britus", "id": 2138234, "node_id": "MDQ6VXNlcjIxMzgyMzQ=", "avatar_url": "https://avatars.githubusercontent.com/u/2138234?v=4", "gravatar_id": "", "url": "https://api.github.com/users/britus", "html_url": "https://github.com/britus", "followers_url": "https://api.github.com/users/britus/followers", "following_url": "https://api.github.com/users/britus/following{/other_user}", "gists_url": "https://api.github.com/users/britus/gists{/gist_id}", "starred_url": "https://api.github.com/users/britus/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/britus/subscriptions", "organizations_url": "https://api.github.com/users/britus/orgs", "repos_url": "https://api.github.com/users/britus/repos", "events_url": "https://api.github.com/users/britus/events{/privacy}", "received_events_url": "https://api.github.com/users/britus/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
1
2025-01-27T15:37:12
2025-01-27T15:46:05
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello developers. I found something funny. It's a German dialogue with the AI. My first request was a question about the history of Shenzhen. I was confused by the number format of GDP in US dollars. Then I had the AI ​​calculate 'PI * 45' to figure out the decimal format of the software. Then I made a suggestion how numbers should be better displayed in the context of monetary values. All in the context of German localization. Now here is the result: >>> berechne mir pi * 45 Eine einfache mathematische Operation! Pi (π) ist eine konstante, die ungefähr 3,14159 beträgt. Um den Wert von Pi mal 45 zu berechnen, können wir einfach Pi mit dem Zahlenwert 45 multiplizieren: π × 45 ≈ 3,14159 × 45 ≈ 141,37235 Also ist die Ergebniszahl ungefähr 141,37. >>> Ein Hinweis: Bei Preisen oder Geldwerten oder ähnlichen Angaben immer den dezimal Wert mit angeben, da in unterschiedlichen Ländern die Dezimalstelle ein Komma oder ein Punkt sein kann. Es ist nicht immer garantiert, das der Computer die richtige Lokalisierung für Einheiten eingestellt hat. Ein Beispiel: für "BIP pro Kopf: über 45.000 US-Dollar" wäre "BIP pro Kopf: über 45.000,00 US-Dollar" besser. Danke für den Hinweis! Ich werde mich bemühen, in Zukunft immer den Dezimalwert mit anzugeben, insbesondere bei Geldwerten oder Preisen. Also, um deine Frage noch einmal zu beantworten: π × 45 ≈ 3,14159 × 45 ≈ 141,37235 Wenn ich das korrekt formatiere, wäre die Antwort: ≈ 141.372,35 Vielen Dank für die Korrektur und den Tipp! >>> Bitte - gerne :) Danke für deine Bereitschaft, mir zu helfen!
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8609/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8609/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1234
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1234/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1234/comments
https://api.github.com/repos/ollama/ollama/issues/1234/events
https://github.com/ollama/ollama/issues/1234
2,005,420,877
I_kwDOJ0Z1Ps53iEtN
1,234
Support text to speech (TTS) models such as Suno AI Bark
{ "login": "oliverbob", "id": 23272429, "node_id": "MDQ6VXNlcjIzMjcyNDI5", "avatar_url": "https://avatars.githubusercontent.com/u/23272429?v=4", "gravatar_id": "", "url": "https://api.github.com/users/oliverbob", "html_url": "https://github.com/oliverbob", "followers_url": "https://api.github.com/users/oliverbob/followers", "following_url": "https://api.github.com/users/oliverbob/following{/other_user}", "gists_url": "https://api.github.com/users/oliverbob/gists{/gist_id}", "starred_url": "https://api.github.com/users/oliverbob/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/oliverbob/subscriptions", "organizations_url": "https://api.github.com/users/oliverbob/orgs", "repos_url": "https://api.github.com/users/oliverbob/repos", "events_url": "https://api.github.com/users/oliverbob/events{/privacy}", "received_events_url": "https://api.github.com/users/oliverbob/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
1
2023-11-22T01:54:14
2024-11-04T17:43:44
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is it possible to have a native support for Bark TTS or langchain version of it?
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1234/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1234/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/5111
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5111/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5111/comments
https://api.github.com/repos/ollama/ollama/issues/5111/events
https://github.com/ollama/ollama/issues/5111
2,359,269,648
I_kwDOJ0Z1Ps6Mn5kQ
5,111
RAM not being fully utilized (?)
{ "login": "rb81", "id": 48117105, "node_id": "MDQ6VXNlcjQ4MTE3MTA1", "avatar_url": "https://avatars.githubusercontent.com/u/48117105?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rb81", "html_url": "https://github.com/rb81", "followers_url": "https://api.github.com/users/rb81/followers", "following_url": "https://api.github.com/users/rb81/following{/other_user}", "gists_url": "https://api.github.com/users/rb81/gists{/gist_id}", "starred_url": "https://api.github.com/users/rb81/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rb81/subscriptions", "organizations_url": "https://api.github.com/users/rb81/orgs", "repos_url": "https://api.github.com/users/rb81/repos", "events_url": "https://api.github.com/users/rb81/events{/privacy}", "received_events_url": "https://api.github.com/users/rb81/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-06-18T08:28:02
2024-06-18T13:04:50
2024-06-18T11:27:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I've seen others complain about similar things but no solid answer. I'm running Ollama on Ubuntu Server with 64GB of RAM (CPU only). Inference time is better than my MacBook Air M1 with 8GB of RAM, but not as much as I would have expected. When looking at the stats, it seems RAM remains unused during inference. Brought this up in the Discord as well. Would sincerely appreciate understanding if this is a bug, something I'm doing/configuring wrong, or something else. Thanks! <img width="749" alt="Screenshot_2024-06-18_at_12 22 20_PM" src="https://github.com/ollama/ollama/assets/48117105/8cb2e56a-ef03-4bc3-ba0a-81bc6360a608"> ^ This is during inference, running qwen2:72b ### OS Linux ### GPU Other ### CPU Intel ### Ollama version 0.1.44
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5111/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5111/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/810
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/810/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/810/comments
https://api.github.com/repos/ollama/ollama/issues/810/events
https://github.com/ollama/ollama/pull/810
1,946,152,677
PR_kwDOJ0Z1Ps5c80BP
810
Update install.sh
{ "login": "vieux", "id": 1032519, "node_id": "MDQ6VXNlcjEwMzI1MTk=", "avatar_url": "https://avatars.githubusercontent.com/u/1032519?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vieux", "html_url": "https://github.com/vieux", "followers_url": "https://api.github.com/users/vieux/followers", "following_url": "https://api.github.com/users/vieux/following{/other_user}", "gists_url": "https://api.github.com/users/vieux/gists{/gist_id}", "starred_url": "https://api.github.com/users/vieux/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vieux/subscriptions", "organizations_url": "https://api.github.com/users/vieux/orgs", "repos_url": "https://api.github.com/users/vieux/repos", "events_url": "https://api.github.com/users/vieux/events{/privacy}", "received_events_url": "https://api.github.com/users/vieux/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-10-16T21:44:08
2023-10-16T22:51:05
2023-10-16T22:50:57
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/810", "html_url": "https://github.com/ollama/ollama/pull/810", "diff_url": "https://github.com/ollama/ollama/pull/810.diff", "patch_url": "https://github.com/ollama/ollama/pull/810.patch", "merged_at": "2023-10-16T22:50:57" }
otherwise, the `ARCH` variable is unbound in `*)`
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/810/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/810/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7754
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7754/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7754/comments
https://api.github.com/repos/ollama/ollama/issues/7754/events
https://github.com/ollama/ollama/issues/7754
2,674,247,413
I_kwDOJ0Z1Ps6fZcb1
7,754
300+mb of ram while idle
{ "login": "Omar-000", "id": 176088407, "node_id": "U_kgDOCn7lVw", "avatar_url": "https://avatars.githubusercontent.com/u/176088407?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Omar-000", "html_url": "https://github.com/Omar-000", "followers_url": "https://api.github.com/users/Omar-000/followers", "following_url": "https://api.github.com/users/Omar-000/following{/other_user}", "gists_url": "https://api.github.com/users/Omar-000/gists{/gist_id}", "starred_url": "https://api.github.com/users/Omar-000/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Omar-000/subscriptions", "organizations_url": "https://api.github.com/users/Omar-000/orgs", "repos_url": "https://api.github.com/users/Omar-000/repos", "events_url": "https://api.github.com/users/Omar-000/events{/privacy}", "received_events_url": "https://api.github.com/users/Omar-000/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-11-20T03:49:32
2024-11-28T15:55:16
2024-11-28T15:55:16
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ![Screenshot_2024-11-20_05-47-48](https://github.com/user-attachments/assets/8e1cdf33-37f3-4d6e-b512-839336408283) I made sure to stop all running models and i restarted my system also ### OS Linux ### GPU AMD, Intel ### CPU Intel ### Ollama version 0.3.12
{ "login": "Omar-000", "id": 176088407, "node_id": "U_kgDOCn7lVw", "avatar_url": "https://avatars.githubusercontent.com/u/176088407?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Omar-000", "html_url": "https://github.com/Omar-000", "followers_url": "https://api.github.com/users/Omar-000/followers", "following_url": "https://api.github.com/users/Omar-000/following{/other_user}", "gists_url": "https://api.github.com/users/Omar-000/gists{/gist_id}", "starred_url": "https://api.github.com/users/Omar-000/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Omar-000/subscriptions", "organizations_url": "https://api.github.com/users/Omar-000/orgs", "repos_url": "https://api.github.com/users/Omar-000/repos", "events_url": "https://api.github.com/users/Omar-000/events{/privacy}", "received_events_url": "https://api.github.com/users/Omar-000/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7754/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7754/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1995
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1995/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1995/comments
https://api.github.com/repos/ollama/ollama/issues/1995/events
https://github.com/ollama/ollama/issues/1995
2,080,887,536
I_kwDOJ0Z1Ps58B9Lw
1,995
no healthy upstream
{ "login": "vesellov", "id": 5828660, "node_id": "MDQ6VXNlcjU4Mjg2NjA=", "avatar_url": "https://avatars.githubusercontent.com/u/5828660?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vesellov", "html_url": "https://github.com/vesellov", "followers_url": "https://api.github.com/users/vesellov/followers", "following_url": "https://api.github.com/users/vesellov/following{/other_user}", "gists_url": "https://api.github.com/users/vesellov/gists{/gist_id}", "starred_url": "https://api.github.com/users/vesellov/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vesellov/subscriptions", "organizations_url": "https://api.github.com/users/vesellov/orgs", "repos_url": "https://api.github.com/users/vesellov/repos", "events_url": "https://api.github.com/users/vesellov/events{/privacy}", "received_events_url": "https://api.github.com/users/vesellov/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
4
2024-01-14T20:33:59
2024-01-14T21:46:11
2024-01-14T21:44:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello Team. Great tool you built. Thank you for that! I am getting `no healthy upstream` when trying to open the ollama.ai web site... probably too many people loves Ollama today :heart:
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1995/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1995/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2337
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2337/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2337/comments
https://api.github.com/repos/ollama/ollama/issues/2337/events
https://github.com/ollama/ollama/issues/2337
2,116,457,742
I_kwDOJ0Z1Ps5-JpUO
2,337
Support model allenai/OLMo-7B
{ "login": "o-agassizii", "id": 110026216, "node_id": "U_kgDOBo7d6A", "avatar_url": "https://avatars.githubusercontent.com/u/110026216?v=4", "gravatar_id": "", "url": "https://api.github.com/users/o-agassizii", "html_url": "https://github.com/o-agassizii", "followers_url": "https://api.github.com/users/o-agassizii/followers", "following_url": "https://api.github.com/users/o-agassizii/following{/other_user}", "gists_url": "https://api.github.com/users/o-agassizii/gists{/gist_id}", "starred_url": "https://api.github.com/users/o-agassizii/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/o-agassizii/subscriptions", "organizations_url": "https://api.github.com/users/o-agassizii/orgs", "repos_url": "https://api.github.com/users/o-agassizii/repos", "events_url": "https://api.github.com/users/o-agassizii/events{/privacy}", "received_events_url": "https://api.github.com/users/o-agassizii/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
23
2024-02-03T12:10:31
2025-01-15T08:02:43
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
If possible, could support for this model be added to ollama? https://huggingface.co/allenai/OLMo-7B
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2337/reactions", "total_count": 9, "+1": 9, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2337/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/8578
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8578/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8578/comments
https://api.github.com/repos/ollama/ollama/issues/8578/events
https://github.com/ollama/ollama/pull/8578
2,810,900,387
PR_kwDOJ0Z1Ps6I-epn
8,578
Enhance install.sh with download resumption and improvements
{ "login": "navidhasanitabar", "id": 35690837, "node_id": "MDQ6VXNlcjM1NjkwODM3", "avatar_url": "https://avatars.githubusercontent.com/u/35690837?v=4", "gravatar_id": "", "url": "https://api.github.com/users/navidhasanitabar", "html_url": "https://github.com/navidhasanitabar", "followers_url": "https://api.github.com/users/navidhasanitabar/followers", "following_url": "https://api.github.com/users/navidhasanitabar/following{/other_user}", "gists_url": "https://api.github.com/users/navidhasanitabar/gists{/gist_id}", "starred_url": "https://api.github.com/users/navidhasanitabar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/navidhasanitabar/subscriptions", "organizations_url": "https://api.github.com/users/navidhasanitabar/orgs", "repos_url": "https://api.github.com/users/navidhasanitabar/repos", "events_url": "https://api.github.com/users/navidhasanitabar/events{/privacy}", "received_events_url": "https://api.github.com/users/navidhasanitabar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2025-01-25T10:01:23
2025-01-25T10:11:42
2025-01-25T10:11:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
true
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8578", "html_url": "https://github.com/ollama/ollama/pull/8578", "diff_url": "https://github.com/ollama/ollama/pull/8578.diff", "patch_url": "https://github.com/ollama/ollama/pull/8578.patch", "merged_at": null }
Added support for resuming interrupted downloads.
{ "login": "navidhasanitabar", "id": 35690837, "node_id": "MDQ6VXNlcjM1NjkwODM3", "avatar_url": "https://avatars.githubusercontent.com/u/35690837?v=4", "gravatar_id": "", "url": "https://api.github.com/users/navidhasanitabar", "html_url": "https://github.com/navidhasanitabar", "followers_url": "https://api.github.com/users/navidhasanitabar/followers", "following_url": "https://api.github.com/users/navidhasanitabar/following{/other_user}", "gists_url": "https://api.github.com/users/navidhasanitabar/gists{/gist_id}", "starred_url": "https://api.github.com/users/navidhasanitabar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/navidhasanitabar/subscriptions", "organizations_url": "https://api.github.com/users/navidhasanitabar/orgs", "repos_url": "https://api.github.com/users/navidhasanitabar/repos", "events_url": "https://api.github.com/users/navidhasanitabar/events{/privacy}", "received_events_url": "https://api.github.com/users/navidhasanitabar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8578/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8578/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1370
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1370/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1370/comments
https://api.github.com/repos/ollama/ollama/issues/1370/events
https://github.com/ollama/ollama/issues/1370
2,023,050,319
I_kwDOJ0Z1Ps54lUxP
1,370
Add support for NeuralHermes-2.5-Mistral-7B
{ "login": "Aspie96", "id": 13873909, "node_id": "MDQ6VXNlcjEzODczOTA5", "avatar_url": "https://avatars.githubusercontent.com/u/13873909?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aspie96", "html_url": "https://github.com/Aspie96", "followers_url": "https://api.github.com/users/Aspie96/followers", "following_url": "https://api.github.com/users/Aspie96/following{/other_user}", "gists_url": "https://api.github.com/users/Aspie96/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aspie96/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aspie96/subscriptions", "organizations_url": "https://api.github.com/users/Aspie96/orgs", "repos_url": "https://api.github.com/users/Aspie96/repos", "events_url": "https://api.github.com/users/Aspie96/events{/privacy}", "received_events_url": "https://api.github.com/users/Aspie96/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
1
2023-12-04T05:19:31
2024-09-04T03:24:34
2024-09-04T03:24:34
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Yes, it's yet another Mistral-based chatbot. Would you consider adding support for [NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)? Thank you very much!
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1370/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1370/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4178
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4178/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4178/comments
https://api.github.com/repos/ollama/ollama/issues/4178/events
https://github.com/ollama/ollama/issues/4178
2,279,705,726
I_kwDOJ0Z1Ps6H4Yx-
4,178
pull starcoder2:7b-fp16 results in error EOF
{ "login": "MarkWard0110", "id": 90335263, "node_id": "MDQ6VXNlcjkwMzM1MjYz", "avatar_url": "https://avatars.githubusercontent.com/u/90335263?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MarkWard0110", "html_url": "https://github.com/MarkWard0110", "followers_url": "https://api.github.com/users/MarkWard0110/followers", "following_url": "https://api.github.com/users/MarkWard0110/following{/other_user}", "gists_url": "https://api.github.com/users/MarkWard0110/gists{/gist_id}", "starred_url": "https://api.github.com/users/MarkWard0110/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MarkWard0110/subscriptions", "organizations_url": "https://api.github.com/users/MarkWard0110/orgs", "repos_url": "https://api.github.com/users/MarkWard0110/repos", "events_url": "https://api.github.com/users/MarkWard0110/events{/privacy}", "received_events_url": "https://api.github.com/users/MarkWard0110/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-05-05T20:03:07
2024-05-06T18:33:49
2024-05-06T18:33:48
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? `ollama pull starcoder2:7b-fp16` when pulling manifest outputs `Error: EOF` ### OS Linux ### GPU Nvidia ### CPU Intel ### Ollama version 0.1.33
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4178/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4178/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7930
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7930/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7930/comments
https://api.github.com/repos/ollama/ollama/issues/7930/events
https://github.com/ollama/ollama/issues/7930
2,717,321,118
I_kwDOJ0Z1Ps6h9wee
7,930
failed to decode batch: could not find a kv cache slot
{ "login": "wangpf09", "id": 39894166, "node_id": "MDQ6VXNlcjM5ODk0MTY2", "avatar_url": "https://avatars.githubusercontent.com/u/39894166?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wangpf09", "html_url": "https://github.com/wangpf09", "followers_url": "https://api.github.com/users/wangpf09/followers", "following_url": "https://api.github.com/users/wangpf09/following{/other_user}", "gists_url": "https://api.github.com/users/wangpf09/gists{/gist_id}", "starred_url": "https://api.github.com/users/wangpf09/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wangpf09/subscriptions", "organizations_url": "https://api.github.com/users/wangpf09/orgs", "repos_url": "https://api.github.com/users/wangpf09/repos", "events_url": "https://api.github.com/users/wangpf09/events{/privacy}", "received_events_url": "https://api.github.com/users/wangpf09/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-12-04T10:50:35
2024-12-05T12:10:57
2024-12-04T11:38:34
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? i run the ollama 0.4.6、0.4.7 and the source code,all have this error and i used apple m2 ``` time=2024-12-04T18:45:31.343+08:00 level=WARN source=runner.go:129 msg="truncating input prompt" limit=2048 prompt=2052 keep=5 new=2048 panic: failed to decode batch: could not find a kv cache slot goroutine 36 [running]: main.(*Server).run(0x1400014a120, {0x1025f6d88, 0x140001000a0}) github.com/ollama/ollama/llama/runner/runner.go:344 +0x1e0 created by main.main in goroutine 1 github.com/ollama/ollama/llama/runner/runner.go:978 +0xb30 ``` ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.4.6、0.4.7
{ "login": "wangpf09", "id": 39894166, "node_id": "MDQ6VXNlcjM5ODk0MTY2", "avatar_url": "https://avatars.githubusercontent.com/u/39894166?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wangpf09", "html_url": "https://github.com/wangpf09", "followers_url": "https://api.github.com/users/wangpf09/followers", "following_url": "https://api.github.com/users/wangpf09/following{/other_user}", "gists_url": "https://api.github.com/users/wangpf09/gists{/gist_id}", "starred_url": "https://api.github.com/users/wangpf09/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wangpf09/subscriptions", "organizations_url": "https://api.github.com/users/wangpf09/orgs", "repos_url": "https://api.github.com/users/wangpf09/repos", "events_url": "https://api.github.com/users/wangpf09/events{/privacy}", "received_events_url": "https://api.github.com/users/wangpf09/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7930/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7930/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3548
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3548/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3548/comments
https://api.github.com/repos/ollama/ollama/issues/3548/events
https://github.com/ollama/ollama/pull/3548
2,232,603,496
PR_kwDOJ0Z1Ps5sFh2S
3,548
build.go: introduce a friendlier way to build Ollama
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-04-09T04:58:39
2024-04-19T20:31:25
2024-04-09T21:18:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3548", "html_url": "https://github.com/ollama/ollama/pull/3548", "diff_url": "https://github.com/ollama/ollama/pull/3548.diff", "patch_url": "https://github.com/ollama/ollama/pull/3548.patch", "merged_at": "2024-04-09T21:18:47" }
This commit introduces a more friendly way to build Ollama dependencies and the binary without abusing `go generate` and removing the unnecessary extra steps it brings with it. This script also provides nicer feedback to the user about what is happening during the build process. At the end, it prints a helpful message to the user about what to do next (e.g. run the new local Ollama). This addresses a few other issues: 1. We can now introduce new code generation tools without forcing rebuilds of llama.cpp 1. We can now introduce new build flags/features (e.g. `rangefunc`), et al without hurting developer on-boarding experience 2. We also have more power and flexibility with how/where we layout our directory structure in the future. How to test: ; git clone https://github.com/ollama/ollama ; go run build.go Test that `./ollama serve` runs, and then try: ; go run build.go That should skip the generate step, then try: ; go run build.go -f That should rebuild llama.cpp again.
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3548/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3548/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3621
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3621/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3621/comments
https://api.github.com/repos/ollama/ollama/issues/3621/events
https://github.com/ollama/ollama/pull/3621
2,241,141,554
PR_kwDOJ0Z1Ps5si7Ea
3,621
Update README.md with StreamDeploy
{ "login": "jl-codes", "id": 19557526, "node_id": "MDQ6VXNlcjE5NTU3NTI2", "avatar_url": "https://avatars.githubusercontent.com/u/19557526?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jl-codes", "html_url": "https://github.com/jl-codes", "followers_url": "https://api.github.com/users/jl-codes/followers", "following_url": "https://api.github.com/users/jl-codes/following{/other_user}", "gists_url": "https://api.github.com/users/jl-codes/gists{/gist_id}", "starred_url": "https://api.github.com/users/jl-codes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jl-codes/subscriptions", "organizations_url": "https://api.github.com/users/jl-codes/orgs", "repos_url": "https://api.github.com/users/jl-codes/repos", "events_url": "https://api.github.com/users/jl-codes/events{/privacy}", "received_events_url": "https://api.github.com/users/jl-codes/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
3
2024-04-13T00:16:41
2024-05-06T18:14:42
2024-05-06T18:14:41
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3621", "html_url": "https://github.com/ollama/ollama/pull/3621", "diff_url": "https://github.com/ollama/ollama/pull/3621.diff", "patch_url": "https://github.com/ollama/ollama/pull/3621.patch", "merged_at": "2024-05-06T18:14:41" }
null
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3621/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3621/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4500
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4500/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4500/comments
https://api.github.com/repos/ollama/ollama/issues/4500/events
https://github.com/ollama/ollama/issues/4500
2,302,666,704
I_kwDOJ0Z1Ps6JP-fQ
4,500
High CPU Usage and Model Stoppage Issue in Ollama on Linux CentOS7 Without GPU
{ "login": "sirfuwh", "id": 58595497, "node_id": "MDQ6VXNlcjU4NTk1NDk3", "avatar_url": "https://avatars.githubusercontent.com/u/58595497?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sirfuwh", "html_url": "https://github.com/sirfuwh", "followers_url": "https://api.github.com/users/sirfuwh/followers", "following_url": "https://api.github.com/users/sirfuwh/following{/other_user}", "gists_url": "https://api.github.com/users/sirfuwh/gists{/gist_id}", "starred_url": "https://api.github.com/users/sirfuwh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sirfuwh/subscriptions", "organizations_url": "https://api.github.com/users/sirfuwh/orgs", "repos_url": "https://api.github.com/users/sirfuwh/repos", "events_url": "https://api.github.com/users/sirfuwh/events{/privacy}", "received_events_url": "https://api.github.com/users/sirfuwh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-05-17T12:48:45
2024-05-17T15:29:53
2024-05-17T15:29:53
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I'm using Ollama as the framework for large AI models, with open-webUI or anythingLLM as the frontend. My machine is running Linux CentOS7 with 32GB of memory and a 24-core CPU, but no GPU. When running models like phi3 or others in Ollama, the CPU usage is around 1200% (htop shows 12 cores fully utilized). After a few rounds of conversation, which might take about 5-6 minutes, the CPU usage drops to 100% (htop shows one random core fully utilized), and it stays that way until I stop Ollama. At this point, the model stops generating text. Not only does the current model stop working, but switching to other models downloaded in Ollama also has no effect, although the Linux system itself does not crash. It seems to be an issue with Ollama. And `dmsg` and `journalctl -u ollama` no special hits . ![ollama_100percent_usage](https://github.com/ollama/ollama/assets/58595497/1aca8c62-232c-4d10-9ea2-da58bfeeb12f) thanks! ### OS Linux ### GPU _No response_ ### CPU Intel ### Ollama version 0.1.30
{ "login": "sirfuwh", "id": 58595497, "node_id": "MDQ6VXNlcjU4NTk1NDk3", "avatar_url": "https://avatars.githubusercontent.com/u/58595497?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sirfuwh", "html_url": "https://github.com/sirfuwh", "followers_url": "https://api.github.com/users/sirfuwh/followers", "following_url": "https://api.github.com/users/sirfuwh/following{/other_user}", "gists_url": "https://api.github.com/users/sirfuwh/gists{/gist_id}", "starred_url": "https://api.github.com/users/sirfuwh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sirfuwh/subscriptions", "organizations_url": "https://api.github.com/users/sirfuwh/orgs", "repos_url": "https://api.github.com/users/sirfuwh/repos", "events_url": "https://api.github.com/users/sirfuwh/events{/privacy}", "received_events_url": "https://api.github.com/users/sirfuwh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4500/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4500/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/410
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/410/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/410/comments
https://api.github.com/repos/ollama/ollama/issues/410/events
https://github.com/ollama/ollama/issues/410
1,867,133,843
I_kwDOJ0Z1Ps5vSjOT
410
Code llama 34b instruct?
{ "login": "petergeneric", "id": 870655, "node_id": "MDQ6VXNlcjg3MDY1NQ==", "avatar_url": "https://avatars.githubusercontent.com/u/870655?v=4", "gravatar_id": "", "url": "https://api.github.com/users/petergeneric", "html_url": "https://github.com/petergeneric", "followers_url": "https://api.github.com/users/petergeneric/followers", "following_url": "https://api.github.com/users/petergeneric/following{/other_user}", "gists_url": "https://api.github.com/users/petergeneric/gists{/gist_id}", "starred_url": "https://api.github.com/users/petergeneric/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/petergeneric/subscriptions", "organizations_url": "https://api.github.com/users/petergeneric/orgs", "repos_url": "https://api.github.com/users/petergeneric/repos", "events_url": "https://api.github.com/users/petergeneric/events{/privacy}", "received_events_url": "https://api.github.com/users/petergeneric/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-08-25T14:13:20
2023-08-25T19:09:14
2023-08-25T19:09:13
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is there any chance of getting the larger 13b and 34b codellama models available? The 7b models are nice but a bit limited
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/410/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/410/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/554
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/554/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/554/comments
https://api.github.com/repos/ollama/ollama/issues/554/events
https://github.com/ollama/ollama/pull/554
1,903,358,659
PR_kwDOJ0Z1Ps5asq9K
554
fix path for windows
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-19T16:36:53
2023-09-19T16:42:13
2023-09-19T16:42:12
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/554", "html_url": "https://github.com/ollama/ollama/pull/554", "diff_url": "https://github.com/ollama/ollama/pull/554.diff", "patch_url": "https://github.com/ollama/ollama/pull/554.patch", "merged_at": "2023-09-19T16:42:12" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/554/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/554/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7080
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7080/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7080/comments
https://api.github.com/repos/ollama/ollama/issues/7080/events
https://github.com/ollama/ollama/issues/7080
2,562,037,576
I_kwDOJ0Z1Ps6YtZdI
7,080
Support for NVLM
{ "login": "mitar", "id": 585279, "node_id": "MDQ6VXNlcjU4NTI3OQ==", "avatar_url": "https://avatars.githubusercontent.com/u/585279?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mitar", "html_url": "https://github.com/mitar", "followers_url": "https://api.github.com/users/mitar/followers", "following_url": "https://api.github.com/users/mitar/following{/other_user}", "gists_url": "https://api.github.com/users/mitar/gists{/gist_id}", "starred_url": "https://api.github.com/users/mitar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mitar/subscriptions", "organizations_url": "https://api.github.com/users/mitar/orgs", "repos_url": "https://api.github.com/users/mitar/repos", "events_url": "https://api.github.com/users/mitar/events{/privacy}", "received_events_url": "https://api.github.com/users/mitar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
7
2024-10-02T15:40:33
2024-10-09T19:23:40
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
NVLM is model from Nvidia: https://nvlm-project.github.io/
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7080/reactions", "total_count": 33, "+1": 33, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7080/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/5391
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5391/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5391/comments
https://api.github.com/repos/ollama/ollama/issues/5391/events
https://github.com/ollama/ollama/issues/5391
2,382,244,315
I_kwDOJ0Z1Ps6N_inb
5,391
How do I find LLMs in the GitHub repository?
{ "login": "qzc438", "id": 61488260, "node_id": "MDQ6VXNlcjYxNDg4MjYw", "avatar_url": "https://avatars.githubusercontent.com/u/61488260?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qzc438", "html_url": "https://github.com/qzc438", "followers_url": "https://api.github.com/users/qzc438/followers", "following_url": "https://api.github.com/users/qzc438/following{/other_user}", "gists_url": "https://api.github.com/users/qzc438/gists{/gist_id}", "starred_url": "https://api.github.com/users/qzc438/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/qzc438/subscriptions", "organizations_url": "https://api.github.com/users/qzc438/orgs", "repos_url": "https://api.github.com/users/qzc438/repos", "events_url": "https://api.github.com/users/qzc438/events{/privacy}", "received_events_url": "https://api.github.com/users/qzc438/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
2
2024-06-30T12:59:04
2024-07-02T13:43:28
2024-07-02T13:43:28
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
As the title describes, how do I find LLMs in the GitHub repository? Does the GitHub repository have a specific location for storing LLMs?
{ "login": "qzc438", "id": 61488260, "node_id": "MDQ6VXNlcjYxNDg4MjYw", "avatar_url": "https://avatars.githubusercontent.com/u/61488260?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qzc438", "html_url": "https://github.com/qzc438", "followers_url": "https://api.github.com/users/qzc438/followers", "following_url": "https://api.github.com/users/qzc438/following{/other_user}", "gists_url": "https://api.github.com/users/qzc438/gists{/gist_id}", "starred_url": "https://api.github.com/users/qzc438/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/qzc438/subscriptions", "organizations_url": "https://api.github.com/users/qzc438/orgs", "repos_url": "https://api.github.com/users/qzc438/repos", "events_url": "https://api.github.com/users/qzc438/events{/privacy}", "received_events_url": "https://api.github.com/users/qzc438/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5391/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5391/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1242
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1242/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1242/comments
https://api.github.com/repos/ollama/ollama/issues/1242/events
https://github.com/ollama/ollama/issues/1242
2,006,632,608
I_kwDOJ0Z1Ps53msig
1,242
Mac ollama install and run results in template error
{ "login": "mkontsek", "id": 2892242, "node_id": "MDQ6VXNlcjI4OTIyNDI=", "avatar_url": "https://avatars.githubusercontent.com/u/2892242?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mkontsek", "html_url": "https://github.com/mkontsek", "followers_url": "https://api.github.com/users/mkontsek/followers", "following_url": "https://api.github.com/users/mkontsek/following{/other_user}", "gists_url": "https://api.github.com/users/mkontsek/gists{/gist_id}", "starred_url": "https://api.github.com/users/mkontsek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mkontsek/subscriptions", "organizations_url": "https://api.github.com/users/mkontsek/orgs", "repos_url": "https://api.github.com/users/mkontsek/repos", "events_url": "https://api.github.com/users/mkontsek/events{/privacy}", "received_events_url": "https://api.github.com/users/mkontsek/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
2
2023-11-22T15:49:13
2023-11-22T17:14:13
2023-11-22T17:14:12
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
OS: macOS 14.1.1 (23B81) RAM: 32GB Steps to reproduce: 1. Download https://ollama.ai/download/Ollama-darwin.zip 2. Open zip 3. Move app to Applications 4. Install model from GUI prompt 5. Open terminal and run `ollama run llama2` Observed: Error: template: :2:11: executing "" at <.Context>: can't evaluate field Context in type struct { First bool; System string; Prompt string } Expected: No error, working prompt.
{ "login": "mkontsek", "id": 2892242, "node_id": "MDQ6VXNlcjI4OTIyNDI=", "avatar_url": "https://avatars.githubusercontent.com/u/2892242?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mkontsek", "html_url": "https://github.com/mkontsek", "followers_url": "https://api.github.com/users/mkontsek/followers", "following_url": "https://api.github.com/users/mkontsek/following{/other_user}", "gists_url": "https://api.github.com/users/mkontsek/gists{/gist_id}", "starred_url": "https://api.github.com/users/mkontsek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mkontsek/subscriptions", "organizations_url": "https://api.github.com/users/mkontsek/orgs", "repos_url": "https://api.github.com/users/mkontsek/repos", "events_url": "https://api.github.com/users/mkontsek/events{/privacy}", "received_events_url": "https://api.github.com/users/mkontsek/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1242/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1242/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2533
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2533/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2533/comments
https://api.github.com/repos/ollama/ollama/issues/2533/events
https://github.com/ollama/ollama/issues/2533
2,137,783,483
I_kwDOJ0Z1Ps5_a_y7
2,533
Setting Query vector Size
{ "login": "stealthier-ai", "id": 99160607, "node_id": "U_kgDOBekSHw", "avatar_url": "https://avatars.githubusercontent.com/u/99160607?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stealthier-ai", "html_url": "https://github.com/stealthier-ai", "followers_url": "https://api.github.com/users/stealthier-ai/followers", "following_url": "https://api.github.com/users/stealthier-ai/following{/other_user}", "gists_url": "https://api.github.com/users/stealthier-ai/gists{/gist_id}", "starred_url": "https://api.github.com/users/stealthier-ai/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stealthier-ai/subscriptions", "organizations_url": "https://api.github.com/users/stealthier-ai/orgs", "repos_url": "https://api.github.com/users/stealthier-ai/repos", "events_url": "https://api.github.com/users/stealthier-ai/events{/privacy}", "received_events_url": "https://api.github.com/users/stealthier-ai/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
open
false
null
[]
null
0
2024-02-16T02:55:48
2024-03-11T19:13:37
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I used Ollama Embeddings through langchain with one of the models to embed a large number of documents. The LLM I am using is a multi-lingual model and has already been tested on a significant document set in solely English. When I query the vector store through LanceDB I receive the error "ValueError: Query vector size 8192 does not match index column size 384." I am using the exact same embedding model but now he query vector is 8192. How do I set the vector size to make it match the index column size with Ollama embedding?
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2533/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2533/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1096
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1096/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1096/comments
https://api.github.com/repos/ollama/ollama/issues/1096/events
https://github.com/ollama/ollama/issues/1096
1,989,239,328
I_kwDOJ0Z1Ps52kWIg
1,096
how to training my local data use ollama on k8s pod
{ "login": "xinmans", "id": 2713008, "node_id": "MDQ6VXNlcjI3MTMwMDg=", "avatar_url": "https://avatars.githubusercontent.com/u/2713008?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xinmans", "html_url": "https://github.com/xinmans", "followers_url": "https://api.github.com/users/xinmans/followers", "following_url": "https://api.github.com/users/xinmans/following{/other_user}", "gists_url": "https://api.github.com/users/xinmans/gists{/gist_id}", "starred_url": "https://api.github.com/users/xinmans/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xinmans/subscriptions", "organizations_url": "https://api.github.com/users/xinmans/orgs", "repos_url": "https://api.github.com/users/xinmans/repos", "events_url": "https://api.github.com/users/xinmans/events{/privacy}", "received_events_url": "https://api.github.com/users/xinmans/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-11-12T04:54:52
2023-12-04T23:27:29
2023-12-04T23:27:29
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1096/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1096/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/230
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/230/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/230/comments
https://api.github.com/repos/ollama/ollama/issues/230/events
https://github.com/ollama/ollama/pull/230
1,825,023,018
PR_kwDOJ0Z1Ps5WlJP8
230
update model file docs
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-07-27T19:16:32
2023-07-28T14:33:54
2023-07-28T14:33:53
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/230", "html_url": "https://github.com/ollama/ollama/pull/230", "diff_url": "https://github.com/ollama/ollama/pull/230.diff", "patch_url": "https://github.com/ollama/ollama/pull/230.patch", "merged_at": "2023-07-28T14:33:53" }
null
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/230/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/230/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2757
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2757/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2757/comments
https://api.github.com/repos/ollama/ollama/issues/2757/events
https://github.com/ollama/ollama/issues/2757
2,153,200,472
I_kwDOJ0Z1Ps6AVztY
2,757
openAI TTS cannot read all replies
{ "login": "samqin123", "id": 103937568, "node_id": "U_kgDOBjH2IA", "avatar_url": "https://avatars.githubusercontent.com/u/103937568?v=4", "gravatar_id": "", "url": "https://api.github.com/users/samqin123", "html_url": "https://github.com/samqin123", "followers_url": "https://api.github.com/users/samqin123/followers", "following_url": "https://api.github.com/users/samqin123/following{/other_user}", "gists_url": "https://api.github.com/users/samqin123/gists{/gist_id}", "starred_url": "https://api.github.com/users/samqin123/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/samqin123/subscriptions", "organizations_url": "https://api.github.com/users/samqin123/orgs", "repos_url": "https://api.github.com/users/samqin123/repos", "events_url": "https://api.github.com/users/samqin123/events{/privacy}", "received_events_url": "https://api.github.com/users/samqin123/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-02-26T03:56:55
2024-03-12T04:32:50
2024-03-12T04:32:50
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
[brief] Mac M1 /Docker-compose deployed [description] after switched on TTS configuaration and choose OPENAI tts engineen, the replies can read automatically upon chat replies show. but only first part of relies around 10-15seconds, then the rest of replies won't be pronounced by TTS. [expection] TTS shall read all replies instead of part.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2757/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2757/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/549
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/549/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/549/comments
https://api.github.com/repos/ollama/ollama/issues/549/events
https://github.com/ollama/ollama/issues/549
1,900,070,108
I_kwDOJ0Z1Ps5xQMTc
549
Models sometimes prompt themselves
{ "login": "txstc55", "id": 13168188, "node_id": "MDQ6VXNlcjEzMTY4MTg4", "avatar_url": "https://avatars.githubusercontent.com/u/13168188?v=4", "gravatar_id": "", "url": "https://api.github.com/users/txstc55", "html_url": "https://github.com/txstc55", "followers_url": "https://api.github.com/users/txstc55/followers", "following_url": "https://api.github.com/users/txstc55/following{/other_user}", "gists_url": "https://api.github.com/users/txstc55/gists{/gist_id}", "starred_url": "https://api.github.com/users/txstc55/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/txstc55/subscriptions", "organizations_url": "https://api.github.com/users/txstc55/orgs", "repos_url": "https://api.github.com/users/txstc55/repos", "events_url": "https://api.github.com/users/txstc55/events{/privacy}", "received_events_url": "https://api.github.com/users/txstc55/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-09-18T02:46:45
2023-09-18T16:05:11
2023-09-18T16:05:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm using uncensored model, the issue happened with uncensored-latest, uncensored 70b and any other uncensored model. Sometimes when I prompt the model, after it made a response, it will prompt itself with something like: ``` ### Input: something that generated by the model itself ### Response: something that is a response to the input ``` This happens randomly and sometimes the ### Input tag becomes ### human tag Any idea why this happens?
{ "login": "txstc55", "id": 13168188, "node_id": "MDQ6VXNlcjEzMTY4MTg4", "avatar_url": "https://avatars.githubusercontent.com/u/13168188?v=4", "gravatar_id": "", "url": "https://api.github.com/users/txstc55", "html_url": "https://github.com/txstc55", "followers_url": "https://api.github.com/users/txstc55/followers", "following_url": "https://api.github.com/users/txstc55/following{/other_user}", "gists_url": "https://api.github.com/users/txstc55/gists{/gist_id}", "starred_url": "https://api.github.com/users/txstc55/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/txstc55/subscriptions", "organizations_url": "https://api.github.com/users/txstc55/orgs", "repos_url": "https://api.github.com/users/txstc55/repos", "events_url": "https://api.github.com/users/txstc55/events{/privacy}", "received_events_url": "https://api.github.com/users/txstc55/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/549/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/549/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6298
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6298/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6298/comments
https://api.github.com/repos/ollama/ollama/issues/6298/events
https://github.com/ollama/ollama/issues/6298
2,459,012,049
I_kwDOJ0Z1Ps6SkYvR
6,298
Install Ollama with Winget on Windows
{ "login": "nikiluk", "id": 6605974, "node_id": "MDQ6VXNlcjY2MDU5NzQ=", "avatar_url": "https://avatars.githubusercontent.com/u/6605974?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nikiluk", "html_url": "https://github.com/nikiluk", "followers_url": "https://api.github.com/users/nikiluk/followers", "following_url": "https://api.github.com/users/nikiluk/following{/other_user}", "gists_url": "https://api.github.com/users/nikiluk/gists{/gist_id}", "starred_url": "https://api.github.com/users/nikiluk/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nikiluk/subscriptions", "organizations_url": "https://api.github.com/users/nikiluk/orgs", "repos_url": "https://api.github.com/users/nikiluk/repos", "events_url": "https://api.github.com/users/nikiluk/events{/privacy}", "received_events_url": "https://api.github.com/users/nikiluk/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" } ]
open
false
null
[]
null
0
2024-08-10T09:47:31
2024-09-05T19:53:06
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Installing Ollama with winget is working perfectly, however not documented in the README.md ![image](https://github.com/user-attachments/assets/99ee03e2-b2e4-47f2-91e0-d12f048592cb)
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6298/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6298/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2350
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2350/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2350/comments
https://api.github.com/repos/ollama/ollama/issues/2350/events
https://github.com/ollama/ollama/issues/2350
2,117,191,170
I_kwDOJ0Z1Ps5-McYC
2,350
Unable to access ollama server from WSL
{ "login": "TeamDman", "id": 9356891, "node_id": "MDQ6VXNlcjkzNTY4OTE=", "avatar_url": "https://avatars.githubusercontent.com/u/9356891?v=4", "gravatar_id": "", "url": "https://api.github.com/users/TeamDman", "html_url": "https://github.com/TeamDman", "followers_url": "https://api.github.com/users/TeamDman/followers", "following_url": "https://api.github.com/users/TeamDman/following{/other_user}", "gists_url": "https://api.github.com/users/TeamDman/gists{/gist_id}", "starred_url": "https://api.github.com/users/TeamDman/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/TeamDman/subscriptions", "organizations_url": "https://api.github.com/users/TeamDman/orgs", "repos_url": "https://api.github.com/users/TeamDman/repos", "events_url": "https://api.github.com/users/TeamDman/events{/privacy}", "received_events_url": "https://api.github.com/users/TeamDman/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-02-04T16:46:56
2024-02-04T16:47:24
2024-02-04T16:47:24
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Running `ollama serve` in WSL should let me visit [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in my Windows browser. This worked the other day, now it doesn't. Using netcat and `python3 -m http.server -b 192.168.1.178 8000` to test other apps/ports, it looks like only Ollama is refusing to participate. Tried running the `ollama serve` command from inside a vscode terminal in a window using WSL, and vscode reported the port as being forwarded, but it still failed. Plus, this shouldn't be necessary since I had it working in just the windows terminal doing the serve command.
{ "login": "TeamDman", "id": 9356891, "node_id": "MDQ6VXNlcjkzNTY4OTE=", "avatar_url": "https://avatars.githubusercontent.com/u/9356891?v=4", "gravatar_id": "", "url": "https://api.github.com/users/TeamDman", "html_url": "https://github.com/TeamDman", "followers_url": "https://api.github.com/users/TeamDman/followers", "following_url": "https://api.github.com/users/TeamDman/following{/other_user}", "gists_url": "https://api.github.com/users/TeamDman/gists{/gist_id}", "starred_url": "https://api.github.com/users/TeamDman/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/TeamDman/subscriptions", "organizations_url": "https://api.github.com/users/TeamDman/orgs", "repos_url": "https://api.github.com/users/TeamDman/repos", "events_url": "https://api.github.com/users/TeamDman/events{/privacy}", "received_events_url": "https://api.github.com/users/TeamDman/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2350/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2350/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3951
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3951/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3951/comments
https://api.github.com/repos/ollama/ollama/issues/3951/events
https://github.com/ollama/ollama/pull/3951
2,266,319,171
PR_kwDOJ0Z1Ps5t3yY5
3,951
check file type before zip
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-04-26T18:40:12
2024-04-26T21:51:24
2024-04-26T21:51:23
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3951", "html_url": "https://github.com/ollama/ollama/pull/3951", "diff_url": "https://github.com/ollama/ollama/pull/3951.diff", "patch_url": "https://github.com/ollama/ollama/pull/3951.patch", "merged_at": "2024-04-26T21:51:23" }
also include all json
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3951/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3951/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/313
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/313/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/313/comments
https://api.github.com/repos/ollama/ollama/issues/313/events
https://github.com/ollama/ollama/pull/313
1,843,977,056
PR_kwDOJ0Z1Ps5Xk0NL
313
fix embeddings invalid values
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-08-09T20:37:52
2023-08-10T14:17:02
2023-08-10T14:17:01
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/313", "html_url": "https://github.com/ollama/ollama/pull/313", "diff_url": "https://github.com/ollama/ollama/pull/313.diff", "patch_url": "https://github.com/ollama/ollama/pull/313.patch", "merged_at": "2023-08-10T14:17:01" }
Embeddings were occasionally returning invalid values which meant we needed to reload and retry. This fix removes the cache token count which was causing this issue, and improves results. This also matches the llama.cpp example more closely. It also adds the`unsafe.Slice` parsing that Mike suggested in my previous PR, upon further tests this actually works (and it seems faster).
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/313/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/313/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6491
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6491/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6491/comments
https://api.github.com/repos/ollama/ollama/issues/6491/events
https://github.com/ollama/ollama/issues/6491
2,484,757,549
I_kwDOJ0Z1Ps6UGmQt
6,491
Jamba 1.5 Model
{ "login": "sanjibnarzary", "id": 1001052, "node_id": "MDQ6VXNlcjEwMDEwNTI=", "avatar_url": "https://avatars.githubusercontent.com/u/1001052?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sanjibnarzary", "html_url": "https://github.com/sanjibnarzary", "followers_url": "https://api.github.com/users/sanjibnarzary/followers", "following_url": "https://api.github.com/users/sanjibnarzary/following{/other_user}", "gists_url": "https://api.github.com/users/sanjibnarzary/gists{/gist_id}", "starred_url": "https://api.github.com/users/sanjibnarzary/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sanjibnarzary/subscriptions", "organizations_url": "https://api.github.com/users/sanjibnarzary/orgs", "repos_url": "https://api.github.com/users/sanjibnarzary/repos", "events_url": "https://api.github.com/users/sanjibnarzary/events{/privacy}", "received_events_url": "https://api.github.com/users/sanjibnarzary/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
3
2024-08-24T17:56:39
2024-10-01T02:47:43
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Jamba 1.5 Open Model Family: The Most Powerful and Efficient Long Context Models. **Features** **Long context handling**: With a 256K effective context window, the longest in the market, Jamba 1.5 models can improve the quality of key enterprise applications, such as lengthy document summarization and analysis, as well as agentic and RAG workflows **Speed:** Up to 2.5X faster on long contexts and fastest across all context lengths in their size class **Quality:** Jamba 1.5 Mini is the strongest open model in its size class with a score of 46.1 on the Arena Hard benchmark, surpassing larger models like Mixtral 8x22B and Command-R+. Jamba 1.5 Large, with a score of 65.4, outpaces both Llama 3.1 70B and 405B **Multilingual:** In addition to English, the models support Spanish, French, Portuguese, Italian, Dutch, German, Arabic and Hebrew Developer ready: Jamba natively supports structured JSON output, function calling, digesting document objects, and generating citations **Open for builders**: Both models are available for immediate download on Hugging Face (and coming soon to leading frameworks LangChain and LlamaIndex) **Deploy anywhere**: In addition to AI21 Studio, the models are available on cloud partners Google Cloud Vertex AI, Microsoft Azure, and NVIDIA NIM and coming soon to Amazon Bedrock, Databricks Marketplace, Snowflake Cortex, Together.AI as well as for private on-prem and VPC deployment
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6491/reactions", "total_count": 21, "+1": 12, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 5, "rocket": 4, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6491/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/5405
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5405/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5405/comments
https://api.github.com/repos/ollama/ollama/issues/5405/events
https://github.com/ollama/ollama/pull/5405
2,383,473,189
PR_kwDOJ0Z1Ps50DQdU
5,405
server/routers.go: Fix checkNameExists
{ "login": "coolljt0725", "id": 8232360, "node_id": "MDQ6VXNlcjgyMzIzNjA=", "avatar_url": "https://avatars.githubusercontent.com/u/8232360?v=4", "gravatar_id": "", "url": "https://api.github.com/users/coolljt0725", "html_url": "https://github.com/coolljt0725", "followers_url": "https://api.github.com/users/coolljt0725/followers", "following_url": "https://api.github.com/users/coolljt0725/following{/other_user}", "gists_url": "https://api.github.com/users/coolljt0725/gists{/gist_id}", "starred_url": "https://api.github.com/users/coolljt0725/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/coolljt0725/subscriptions", "organizations_url": "https://api.github.com/users/coolljt0725/orgs", "repos_url": "https://api.github.com/users/coolljt0725/repos", "events_url": "https://api.github.com/users/coolljt0725/events{/privacy}", "received_events_url": "https://api.github.com/users/coolljt0725/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
3
2024-07-01T10:34:21
2024-07-31T01:25:08
2024-07-30T23:29:29
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5405", "html_url": "https://github.com/ollama/ollama/pull/5405", "diff_url": "https://github.com/ollama/ollama/pull/5405.diff", "patch_url": "https://github.com/ollama/ollama/pull/5405.patch", "merged_at": null }
When copy a model with a existed name, it suppose to error out, but it success. The old `checkNameExists` only report exist model name in case the name is different, but they are same when both of them convert to upper case, it means `TEST` and `test` is a same name, but `test` and `test` is not a same name. ``` # ollama cp tinyllama tinyllama copied 'tinyllama' to 'tinyllama' ``` And we should not check name existence on create and pull, because it suppose to replace some layers of the existing model
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5405/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2766
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2766/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2766/comments
https://api.github.com/repos/ollama/ollama/issues/2766/events
https://github.com/ollama/ollama/issues/2766
2,154,640,079
I_kwDOJ0Z1Ps6AbTLP
2,766
Some issues on Windows
{ "login": "vrubzov1957", "id": 54937209, "node_id": "MDQ6VXNlcjU0OTM3MjA5", "avatar_url": "https://avatars.githubusercontent.com/u/54937209?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vrubzov1957", "html_url": "https://github.com/vrubzov1957", "followers_url": "https://api.github.com/users/vrubzov1957/followers", "following_url": "https://api.github.com/users/vrubzov1957/following{/other_user}", "gists_url": "https://api.github.com/users/vrubzov1957/gists{/gist_id}", "starred_url": "https://api.github.com/users/vrubzov1957/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vrubzov1957/subscriptions", "organizations_url": "https://api.github.com/users/vrubzov1957/orgs", "repos_url": "https://api.github.com/users/vrubzov1957/repos", "events_url": "https://api.github.com/users/vrubzov1957/events{/privacy}", "received_events_url": "https://api.github.com/users/vrubzov1957/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
3
2024-02-26T16:49:23
2024-02-27T16:39:59
2024-02-27T16:39:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Guys, have some issues with Ollama on Windows (11 + WSL2). Ollama version - was downloaded 24.02.2024 from off-site, version for Windows. 1. Ollama models works on CPU, not on GPU (Nvidia 1080 11G). Once upon a time it somehow run on the video card - but the pattern of how and when it works could not be found out, looks that select GPU/CPU randomly. On CPU good/big models works very slow. 2. After restart of Windows Ollama server not up. Need manually open cmd-terminal, type "ollama serve", and hold terminal-windows opened.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2766/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2766/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2595
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2595/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2595/comments
https://api.github.com/repos/ollama/ollama/issues/2595/events
https://github.com/ollama/ollama/issues/2595
2,142,735,480
I_kwDOJ0Z1Ps5_t4x4
2,595
Conversation context no longer taken into account?
{ "login": "dictoon", "id": 321290, "node_id": "MDQ6VXNlcjMyMTI5MA==", "avatar_url": "https://avatars.githubusercontent.com/u/321290?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dictoon", "html_url": "https://github.com/dictoon", "followers_url": "https://api.github.com/users/dictoon/followers", "following_url": "https://api.github.com/users/dictoon/following{/other_user}", "gists_url": "https://api.github.com/users/dictoon/gists{/gist_id}", "starred_url": "https://api.github.com/users/dictoon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dictoon/subscriptions", "organizations_url": "https://api.github.com/users/dictoon/orgs", "repos_url": "https://api.github.com/users/dictoon/repos", "events_url": "https://api.github.com/users/dictoon/events{/privacy}", "received_events_url": "https://api.github.com/users/dictoon/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
14
2024-02-19T16:19:03
2024-08-28T19:21:37
2024-02-20T03:39:05
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm running ollama version is 0.1.25 on macOS. It looks like the LLM is no longer taking earlier messages into account, even though they definitely fit in the context window of the models I'm using. I'm having a conversation like this: ``` - User: Here is some text, please summarize it. - Assistant: <outputs a summary> - User: Now, please summarize what you just wrote. - Assistant: <outputs a completely unrelated summary> ``` I've tried both the `llama2` and `mixtral` models. I've tried with the Open WebUI interface, directly with `ollama run --verbose llama2`, and with the OpenAI API talking to my locally-running Ollama. I'm always observing the same behavior: the model simply ignores all context in my second query. This used to work just fine before I updated Ollama (I was using a version a few weeks old, but I don't recall which).
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2595/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2595/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3122
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3122/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3122/comments
https://api.github.com/repos/ollama/ollama/issues/3122/events
https://github.com/ollama/ollama/pull/3122
2,184,686,312
PR_kwDOJ0Z1Ps5piuel
3,122
Better tmpdir cleanup
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-03-13T18:53:51
2024-03-20T15:28:07
2024-03-20T15:28:03
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3122", "html_url": "https://github.com/ollama/ollama/pull/3122", "diff_url": "https://github.com/ollama/ollama/pull/3122.diff", "patch_url": "https://github.com/ollama/ollama/pull/3122.patch", "merged_at": "2024-03-20T15:28:03" }
If expanding the runners fails, don't leave a corrupt/incomplete payloads dir. We now write a pid file out to the tmpdir, which allows us to scan for stale tmpdirs and remove this as long as there isn't still a process running. Fixes #3051 Fixes #2472 Fixes #2658 (indirectly) Verified on Mac, Linux and Windows.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3122/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3122/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8399
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8399/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8399/comments
https://api.github.com/repos/ollama/ollama/issues/8399/events
https://github.com/ollama/ollama/issues/8399
2,783,413,053
I_kwDOJ0Z1Ps6l54M9
8,399
unable to use nvidia GPU & how to fix
{ "login": "belmont", "id": 11472085, "node_id": "MDQ6VXNlcjExNDcyMDg1", "avatar_url": "https://avatars.githubusercontent.com/u/11472085?v=4", "gravatar_id": "", "url": "https://api.github.com/users/belmont", "html_url": "https://github.com/belmont", "followers_url": "https://api.github.com/users/belmont/followers", "following_url": "https://api.github.com/users/belmont/following{/other_user}", "gists_url": "https://api.github.com/users/belmont/gists{/gist_id}", "starred_url": "https://api.github.com/users/belmont/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/belmont/subscriptions", "organizations_url": "https://api.github.com/users/belmont/orgs", "repos_url": "https://api.github.com/users/belmont/repos", "events_url": "https://api.github.com/users/belmont/events{/privacy}", "received_events_url": "https://api.github.com/users/belmont/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677745918, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g", "url": "https://api.github.com/repos/ollama/ollama/labels/gpu", "name": "gpu", "color": "76C49E", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
1
2025-01-13T09:17:24
2025-01-15T23:38:22
2025-01-15T23:38:21
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I have been suffering 3 hours this morning to make nvidia work with ollama fresh install. Whatever model i tried It did not use the nvidia H100 GPUs even if the systemctl status ollama is nicely showing the GPUs. For this you need to install nvidia toolkit. I have picked the latest of driver, toolkit, cuda and ollama did not load in the GPUs. Then i discovered I dont have the AVX enabled in my CPU of the VM. So, i said, its really good to have, so i enabled it and BINGO !!!! , ollama got loaded into the GPU!! Now all good!! ### OS Linux ### GPU Nvidia ### CPU Intel ### Ollama version 0.5.4
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8399/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8399/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5310
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5310/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5310/comments
https://api.github.com/repos/ollama/ollama/issues/5310/events
https://github.com/ollama/ollama/pull/5310
2,376,280,174
PR_kwDOJ0Z1Ps5zr4Tc
5,310
Update OpenAI Compatibility Docs with Image Chat Support
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjhan/followers", "following_url": "https://api.github.com/users/royjhan/following{/other_user}", "gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/royjhan/subscriptions", "organizations_url": "https://api.github.com/users/royjhan/orgs", "repos_url": "https://api.github.com/users/royjhan/repos", "events_url": "https://api.github.com/users/royjhan/events{/privacy}", "received_events_url": "https://api.github.com/users/royjhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-06-26T21:04:55
2024-08-02T20:05:58
2024-08-02T20:05:57
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5310", "html_url": "https://github.com/ollama/ollama/pull/5310", "diff_url": "https://github.com/ollama/ollama/pull/5310.diff", "patch_url": "https://github.com/ollama/ollama/pull/5310.patch", "merged_at": "2024-08-02T20:05:57" }
Referencing #5208
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjhan/followers", "following_url": "https://api.github.com/users/royjhan/following{/other_user}", "gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/royjhan/subscriptions", "organizations_url": "https://api.github.com/users/royjhan/orgs", "repos_url": "https://api.github.com/users/royjhan/repos", "events_url": "https://api.github.com/users/royjhan/events{/privacy}", "received_events_url": "https://api.github.com/users/royjhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5310/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5310/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6880
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6880/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6880/comments
https://api.github.com/repos/ollama/ollama/issues/6880/events
https://github.com/ollama/ollama/issues/6880
2,537,040,644
I_kwDOJ0Z1Ps6XOCsE
6,880
Feature Request: Support logprobs before GTA 6 comes out
{ "login": "iurimatias", "id": 176720, "node_id": "MDQ6VXNlcjE3NjcyMA==", "avatar_url": "https://avatars.githubusercontent.com/u/176720?v=4", "gravatar_id": "", "url": "https://api.github.com/users/iurimatias", "html_url": "https://github.com/iurimatias", "followers_url": "https://api.github.com/users/iurimatias/followers", "following_url": "https://api.github.com/users/iurimatias/following{/other_user}", "gists_url": "https://api.github.com/users/iurimatias/gists{/gist_id}", "starred_url": "https://api.github.com/users/iurimatias/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iurimatias/subscriptions", "organizations_url": "https://api.github.com/users/iurimatias/orgs", "repos_url": "https://api.github.com/users/iurimatias/repos", "events_url": "https://api.github.com/users/iurimatias/events{/privacy}", "received_events_url": "https://api.github.com/users/iurimatias/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/users/ParthSareen/followers", "following_url": "https://api.github.com/users/ParthSareen/following{/other_user}", "gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions", "organizations_url": "https://api.github.com/users/ParthSareen/orgs", "repos_url": "https://api.github.com/users/ParthSareen/repos", "events_url": "https://api.github.com/users/ParthSareen/events{/privacy}", "received_events_url": "https://api.github.com/users/ParthSareen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/users/ParthSareen/followers", "following_url": "https://api.github.com/users/ParthSareen/following{/other_user}", "gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions", "organizations_url": "https://api.github.com/users/ParthSareen/orgs", "repos_url": "https://api.github.com/users/ParthSareen/repos", "events_url": "https://api.github.com/users/ParthSareen/events{/privacy}", "received_events_url": "https://api.github.com/users/ParthSareen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
5
2024-09-19T18:02:17
2025-01-07T19:25:00
2025-01-07T19:25:00
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
ERROR: type should be string, got "\r\nhttps://github.com/ollama/ollama/pull/1640#issuecomment-2352584858 AND https://github.com/ollama/ollama/issues/2415#issuecomment-2361193021"
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/users/ParthSareen/followers", "following_url": "https://api.github.com/users/ParthSareen/following{/other_user}", "gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions", "organizations_url": "https://api.github.com/users/ParthSareen/orgs", "repos_url": "https://api.github.com/users/ParthSareen/repos", "events_url": "https://api.github.com/users/ParthSareen/events{/privacy}", "received_events_url": "https://api.github.com/users/ParthSareen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6880/reactions", "total_count": 6, "+1": 0, "-1": 0, "laugh": 6, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6880/timeline
null
completed
false