SimpleChatTC:Cleanup:ChatProps: iRecentUserMsgCnt

Update Me class
Update show settings
Update show props info
Update readme

This commit is contained in:
parent 7409b29862
commit 78ccca056f
@@ -191,6 +191,17 @@ It is attached to the document object. Some of these can also be updated using t
 in turn the machine goes into power saving mode or so, the platform may stop the network connection,
 leading to an exception.
 
+iRecentUserMsgCnt - a simple-minded sliding window to limit the context window load at the AI model end.
+This is set to 10 by default. So in addition to the latest system message, the last/latest iRecentUserMsgCnt
+user messages after the latest system prompt, and their responses from the ai-model, will be sent
+to the ai-model when querying for a new response. Note that if enabled, only user messages after
+the latest system message/prompt will be considered.
+
+This specified sliding window user message count also includes the latest user query.
+<0 : Send the entire chat history to the server
+ 0 : Send only the system message, if any, to the server
+>0 : Send the latest chat history from the latest system prompt, limited to the specified count
+
 tools - contains controls related to tool calling
 
 enabled - controls whether tool calling is enabled or not
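
To make the <0 / 0 / >0 semantics above concrete, here is a minimal standalone sketch of such a sliding-window selection. This is not the actual SimpleChat implementation; the helper name and the {role, content} message shape are assumptions for illustration:

// Hypothetical sketch of the iRecentUserMsgCnt sliding-window semantics.
// messages is an array of {role, content} objects, oldest first.
function recentWindow(messages, iRecentUserMsgCnt) {
    if (iRecentUserMsgCnt < 0) {
        return messages.slice(); // <0: entire chat history
    }
    const iLastSys = messages.map(m => m.role).lastIndexOf("system");
    const sys = (iLastSys >= 0) ? [messages[iLastSys]] : [];
    if (iRecentUserMsgCnt == 0) {
        return sys; // 0: only the system message, if any
    }
    // >0: find the Nth user message from the end (the latest query counts),
    // looking only at messages after the latest system message.
    let userCnt = 0;
    let iStart = iLastSys + 1;
    for (let i = messages.length - 1; i > iLastSys; i--) {
        if (messages[i].role === "user") {
            userCnt += 1;
            if (userCnt == iRecentUserMsgCnt) {
                iStart = i;
                break;
            }
        }
    }
    return sys.concat(messages.slice(iStart));
}

For example, with a history of system, Q1, A1, Q2, A2, Q3 and a window of 2, this keeps system, Q2, A2, Q3, matching the behaviour described above.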
@@ -250,22 +261,12 @@ It is attached to the document object. Some of these can also be updated using t
 Content-Type is set to application/json. Additionally an Authorization entry is provided, which can
 be set if needed using the settings ui.
 
-iRecentUserMsgCnt - a simple-minded sliding window to limit the context window load at the AI model end.
-This is set to 10 by default. So in addition to the latest system message, the last/latest iRecentUserMsgCnt
-user messages after the latest system prompt, and their responses from the ai-model, will be sent
-to the ai-model when querying for a new response. Note that if enabled, only user messages after
-the latest system message/prompt will be considered.
-
-This specified sliding window user message count also includes the latest user query.
-<0 : Send the entire chat history to the server
- 0 : Send only the system message, if any, to the server
->0 : Send the latest chat history from the latest system prompt, limited to the specified count
-
-By using gMe's iRecentUserMsgCnt and apiRequestOptions.max_tokens/n_predict one can try to control
-the implications of loading the ai-model's context window with chat history, wrt the chat response, to
-some extent in a simple, crude way. You may also want to control the context size enabled when the
-server loads the ai-model, on the server end.
+By using gMe's chatProps.iRecentUserMsgCnt and apiRequestOptions.max_tokens/n_predict one can try to
+control the implications of loading the ai-model's context window with chat history, wrt the chat response,
+to some extent in a simple, crude way. You may also want to control the context size enabled when the
+server loads the ai-model, on the server end. One can look at the current context size set on the server
+end by looking at the settings/info block shown whenever one switches to or is shown a new session.
 
 Sometimes the browser may be stubborn with caching of the file, so your updates to html/css/js
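
As a concrete usage sketch, both knobs can also be poked from the browser devtools console, since gMe is attached to the document object; the values below are purely illustrative, and whether max_tokens or n_predict applies depends on the endpoint in use:

// Illustrative only: tighten the sliding window and cap the response length.
gMe.chatProps.iRecentUserMsgCnt = 3;     // last 3 user msgs + their responses
gMe.apiRequestOptions.max_tokens = 1024; // or n_predict, endpoint dependent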
@@ -288,8 +289,8 @@ the system prompt, anytime during the conversation or only at the beginning.
 By default things are set up to try and make the user experience a bit better, if possible.
 However a developer, when testing the server or the ai-model, may want to change these values.
 
-Using iRecentUserMsgCnt reduces the chat history context sent to the server/ai-model to be
-just the system-prompt, prev-user-request-and-ai-response and cur-user-request, instead of the
+Using chatProps.iRecentUserMsgCnt reduces the chat history context sent to the server/ai-model to be
+just the system-prompt, a few prev-user-requests-and-ai-responses and the cur-user-request, instead of the
 full chat history. This way if there is any response with garbage/repetition, it doesn't
 mess with things beyond the next question/request/query, in some ways. The trim garbage
 option also tries to help avoid issues with garbage in the context to an extent.
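
A small hypothetical walkthrough of the effect described above, with illustrative messages:

// Hypothetical chat history, oldest first:
let history = [
    { role: "system",    content: "You are helpful." },
    { role: "user",      content: "Q1" },
    { role: "assistant", content: "A1" },
    { role: "user",      content: "Q2" },
    { role: "assistant", content: "A2 ...garbage/repetition..." },
    { role: "user",      content: "Q3" },
];
// With gMe.chatProps.iRecentUserMsgCnt = 2, the request for Q3 carries only
// system, Q2, A2, Q3. The garbage A2 is still visible for this one turn,
// but once Q4 is asked the window slides past it and A2 drops out.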
@@ -413,7 +413,7 @@ class SimpleChat {
             div.replaceChildren();
         }
         let last = undefined;
-        for(const x of this.recent_chat(gMe.iRecentUserMsgCnt)) {
+        for(const x of this.recent_chat(gMe.chatProps.iRecentUserMsgCnt)) {
             let entry = ui.el_create_append_p(`${x.ns.role}: ${x.content_equiv()}`, div);
             entry.className = `role-${x.ns.role}`;
             last = entry;
@@ -473,7 +473,7 @@ class SimpleChat {
      */
     request_messages_jsonstr() {
         let req = {
-            messages: this.recent_chat_ns(gMe.iRecentUserMsgCnt),
+            messages: this.recent_chat_ns(gMe.chatProps.iRecentUserMsgCnt),
         }
         return this.request_jsonstr_extend(req);
     }
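
For reference, the body produced for the chat endpoint would then look roughly like this; the message values are illustrative, and which extra keys request_jsonstr_extend merges in depends on configuration:

// Illustrative request body after the sliding window is applied:
let req = {
    messages: [
        { role: "system",    content: "You are helpful." },
        { role: "user",      content: "Q2" },
        { role: "assistant", content: "A2" },
        { role: "user",      content: "Q3" },
    ],
    // plus whatever request_jsonstr_extend merges in, e.g. sampling
    // options from gMe.apiRequestOptions (illustrative key below):
    max_tokens: 1024,
};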
@@ -485,7 +485,7 @@ class SimpleChat {
     request_prompt_jsonstr(bInsertStandardRolePrefix) {
         let prompt = "";
         let iCnt = 0;
-        for(const msg of this.recent_chat(gMe.iRecentUserMsgCnt)) {
+        for(const msg of this.recent_chat(gMe.chatProps.iRecentUserMsgCnt)) {
             iCnt += 1;
             if (iCnt > 1) {
                 prompt += "\n";
@@ -1021,11 +1021,11 @@ class Me {
         };
         this.chatProps = {
             stream: true,
-        }
+            iRecentUserMsgCnt: 10,
+        };
         this.bCompletionFreshChatAlways = true;
         this.bCompletionInsertStandardRolePrefix = false;
         this.bTrimGarbage = true;
-        this.iRecentUserMsgCnt = 10;
         /** @type {Object<string, number>} */
         this.sRecentUserMsgCnt = {
             "Full": -1,
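
For context, sRecentUserMsgCnt maps the human-readable labels shown in the settings drop-down to the numeric window sizes stored in chatProps.iRecentUserMsgCnt. Only the "Full": -1 entry is visible in this hunk; the other entries below are hypothetical stand-ins to show the shape:

// Label -> value map driving the ChatHistoryInCtxt select (sketch):
let sRecentUserMsgCnt = {
    "Full": -1,  // from this hunk: send the entire chat history
    "None": 0,   // hypothetical: only the system message, if any
    "Last2": 3,  // hypothetical: 2 previous user msgs + the current one
};
// The select callback then effectively does:
//   this.chatProps.iRecentUserMsgCnt = sRecentUserMsgCnt[selectedLabel];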
@@ -1094,9 +1094,9 @@ class Me {
      * @param {boolean} bAll
      */
     show_info(elDiv, bAll=false) {
-        let props = ["baseURL", "modelInfo","headers", "tools", "apiRequestOptions", "apiEP", "chatProps", "iRecentUserMsgCnt", "bTrimGarbage", "bCompletionFreshChatAlways", "bCompletionInsertStandardRolePrefix"];
+        let props = ["baseURL", "modelInfo","headers", "tools", "apiRequestOptions", "apiEP", "chatProps", "bTrimGarbage", "bCompletionFreshChatAlways", "bCompletionInsertStandardRolePrefix"];
         if (!bAll) {
-            props = [ "baseURL", "modelInfo", "headers", "tools", "apiRequestOptions", "apiEP", "chatProps", "iRecentUserMsgCnt" ];
+            props = [ "baseURL", "modelInfo", "headers", "tools", "apiRequestOptions", "apiEP", "chatProps" ];
         }
         fetch(`${this.baseURL}/props`).then(resp=>resp.json()).then(json=>{
             this.modelInfo = {
@@ -1112,12 +1112,12 @@ class Me {
      * @param {HTMLDivElement} elDiv
      */
     show_settings(elDiv) {
-        ui.ui_show_obj_props_edit(elDiv, "", this, ["baseURL", "headers", "tools", "apiRequestOptions", "apiEP", "chatProps", "iRecentUserMsgCnt", "bTrimGarbage", "bCompletionFreshChatAlways", "bCompletionInsertStandardRolePrefix"], "Settings", (prop, elProp)=>{
+        ui.ui_show_obj_props_edit(elDiv, "", this, ["baseURL", "headers", "tools", "apiRequestOptions", "apiEP", "chatProps", "bTrimGarbage", "bCompletionFreshChatAlways", "bCompletionInsertStandardRolePrefix"], "Settings", (prop, elProp)=>{
             if (prop == "headers:Authorization") {
                 // @ts-ignore
                 elProp.placeholder = "Bearer OPENAI_API_KEY";
             }
-        }, [":apiEP", ":iRecentUserMsgCnt"], (propWithPath, prop, elParent)=>{
+        }, [":apiEP", ":chatProps:iRecentUserMsgCnt"], (propWithPath, prop, elParent)=>{
             if (propWithPath == ":apiEP") {
                 let sel = ui.el_creatediv_select("SetApiEP", "ApiEndPoint", ApiEP.Type, this.apiEP, (val)=>{
                     // @ts-ignore
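
Note the propWithPath convention visible here: nested properties are addressed by ':'-separated path components, which is why moving the field into chatProps changes its path from ":iRecentUserMsgCnt" to ":chatProps:iRecentUserMsgCnt". A tiny hypothetical helper illustrating the convention:

// Hypothetical: build a ':'-separated path for a nested property.
function propPath(...parts) {
    return ":" + parts.join(":");
}
propPath("chatProps", "iRecentUserMsgCnt"); // ":chatProps:iRecentUserMsgCnt"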
@@ -1125,9 +1125,9 @@ class Me {
                 });
                 elParent.appendChild(sel.div);
             }
-            if (propWithPath == ":iRecentUserMsgCnt") {
-                let sel = ui.el_creatediv_select("SetChatHistoryInCtxt", "ChatHistoryInCtxt", this.sRecentUserMsgCnt, this.iRecentUserMsgCnt, (val)=>{
-                    this.iRecentUserMsgCnt = this.sRecentUserMsgCnt[val];
+            if (propWithPath == ":chatProps:iRecentUserMsgCnt") {
+                let sel = ui.el_creatediv_select("SetChatHistoryInCtxt", "ChatHistoryInCtxt", this.sRecentUserMsgCnt, this.chatProps.iRecentUserMsgCnt, (val)=>{
+                    this.chatProps.iRecentUserMsgCnt = this.sRecentUserMsgCnt[val];
                 });
                 elParent.appendChild(sel.div);
             }