feat: add native azure openai configuration (#4)
chore: also fixed markdown lint errors
parent b1d7debd49
commit b3132f4e6f
README.md (56 changed lines)
@@ -2,14 +2,10 @@
 **avante.nvim** is a Neovim plugin designed to emulate the behavior of the [Cursor](https://www.cursor.com) AI IDE, providing users with AI-driven code suggestions and the ability to apply these recommendations directly to their source files with minimal effort.

 ⚠️⚠️ **WARNING: This plugin is still in a very early stage of development, so please be aware that the current code is very messy and unstable, and problems are likely to occur.**

 https://github.com/user-attachments/assets/510e6270-b6cf-459d-9a2f-15b397d1fe53

 ## Features

 - **AI-Powered Code Assistance**: Interact with AI to ask questions about your current code file and receive intelligent suggestions for improvement or modification.
@@ -44,13 +40,20 @@ Default setup configuration:

 ```lua
 {
-  provider = "claude", -- openai, claude
+  provider = "claude", -- "claude" or "openai" or "azure"
   openai = {
     endpoint = "https://api.openai.com",
     model = "gpt-4o",
     temperature = 0,
     max_tokens = 4096,
   },
+  azure = {
+    endpoint = "", -- Example: "https://<your-resource-name>.openai.azure.com"
+    deployment = "", -- Azure deployment name (e.g., "gpt-4o", "my-gpt-4o-deployment")
+    api_version = "2024-05-13",
+    temperature = 0,
+    max_tokens = 4096,
+  },
   claude = {
     endpoint = "https://api.anthropic.com",
     model = "claude-3-5-sonnet-20240620",
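
To show how the new option is meant to be used from a user config, here is a minimal sketch. It assumes the plugin is initialized through the conventional `require("avante").setup()` call and that the `provider` and `azure` keys behave as in the defaults above; the resource name and deployment below are hypothetical placeholders, not values from this commit.

```lua
-- Minimal sketch of switching to the new Azure provider (hypothetical values).
require("avante").setup({
  provider = "azure", -- select the Azure OpenAI branch added in this commit
  azure = {
    endpoint = "https://my-resource.openai.azure.com", -- placeholder resource name
    deployment = "gpt-4o",                             -- placeholder deployment name
    api_version = "2024-05-13",
  },
})
```

Leaving `provider` set to `"claude"` or `"openai"` keeps the previous behavior.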
@@ -81,14 +84,26 @@ Default setup configuration:
 Given its early stage, `avante.nvim` currently supports the following basic functionalities:

-1. Set `ANTHROPIC_API_KEY` environment variable:
-```sh
-export ANTHROPIC_API=your-api-key
-```
-Or set `OPENAI_API_KEY` environment variable:
-```sh
-export OPENAI_API_KEY=your-api-key
-```
+1. Set the appropriate API key as an environment variable:
+
+For Claude:
+
+```sh
+export ANTHROPIC_API_KEY=your-api-key
+```
+
+For OpenAI:
+
+```sh
+export OPENAI_API_KEY=your-api-key
+```
+
+For Azure OpenAI:
+
+```sh
+export AZURE_OPENAI_API_KEY=your-api-key
+```
+
 2. Open a code file in Neovim.
 3. Use the `:AvanteAsk` command to query the AI about the code.
 4. Review the AI's suggestions.
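
A note on the Azure step above: as the source hunk further down shows, the key is resolved with a fallback, so an existing `OPENAI_API_KEY` is still honored when `AZURE_OPENAI_API_KEY` is not set. A minimal sketch of that lookup:

```lua
-- Key resolution mirrored from the Azure branch in the diff below:
-- prefer AZURE_OPENAI_API_KEY, fall back to OPENAI_API_KEY.
local api_key = os.getenv("AZURE_OPENAI_API_KEY") or os.getenv("OPENAI_API_KEY")
if not api_key then
  error("Azure OpenAI API key is not set. Please set AZURE_OPENAI_API_KEY or OPENAI_API_KEY environment variable.")
end
```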
@@ -100,13 +115,13 @@ export OPENAI_API_KEY=your-api-key

 The following key bindings are available for use with `avante.nvim`:

-- <kbd>Leader</kbd><kbd>a</kbd><kbd>a</kbd> — show sidebar
-- <kbd>c</kbd><kbd>o</kbd> — choose ours
-- <kbd>c</kbd><kbd>t</kbd> — choose theirs
-- <kbd>c</kbd><kbd>b</kbd> — choose both
-- <kbd>c</kbd><kbd>0</kbd> — choose none
-- <kbd>]</kbd><kbd>x</kbd> — move to previous conflict
-- <kbd>[</kbd><kbd>x</kbd> — move to next conflict
+- `<kbd>Leader</kbd><kbd>a</kbd><kbd>a</kbd>` — show sidebar
+- `<kbd>c</kbd><kbd>o</kbd>` — choose ours
+- `<kbd>c</kbd><kbd>t</kbd>` — choose theirs
+- `<kbd>c</kbd><kbd>b</kbd>` — choose both
+- `<kbd>c</kbd><kbd>0</kbd>` — choose none
+- `<kbd>]</kbd><kbd>x</kbd>` — move to previous conflict
+- `<kbd>[</kbd><kbd>x</kbd>` — move to next conflict

 ## Roadmap

@@ -130,6 +145,7 @@ Contributions to avante.nvim are welcome! If you're interested in helping out, p
 ## Development

 To set up the development environment:

 1. Install [StyLua](https://github.com/JohnnyMorganz/StyLua) for Lua code formatting.
 2. Install [pre-commit](https://pre-commit.com) for managing and maintaining pre-commit hooks.
 3. After cloning the repository, run the following command to set up pre-commit hooks:
@@ -298,12 +298,49 @@ local function call_openai_api_stream(question, code_lang, code_content, on_chun
     .. code_content
     .. "\n```"

-  local url = utils.trim_suffix(M.config.openai.endpoint, "/") .. "/v1/chat/completions"
+  local url, headers, body
   if M.config.provider == "azure" then
-    url = M.config.openai.endpoint
+    api_key = os.getenv("AZURE_OPENAI_API_KEY") or os.getenv("OPENAI_API_KEY")
+    if not api_key then
+      error("Azure OpenAI API key is not set. Please set AZURE_OPENAI_API_KEY or OPENAI_API_KEY environment variable.")
+    end
+    url = M.config.azure.endpoint
+      .. "/openai/deployments/"
+      .. M.config.azure.deployment
+      .. "/chat/completions?api-version="
+      .. M.config.azure.api_version
+    headers = {
+      ["Content-Type"] = "application/json",
+      ["api-key"] = api_key,
+    }
+    body = {
+      messages = {
+        { role = "system", content = system_prompt },
+        { role = "user", content = user_prompt },
+      },
+      temperature = M.config.azure.temperature,
+      max_tokens = M.config.azure.max_tokens,
+      stream = true,
+    }
+  else
+    url = utils.trim_suffix(M.config.openai.endpoint, "/") .. "/v1/chat/completions"
+    headers = {
+      ["Content-Type"] = "application/json",
+      ["Authorization"] = "Bearer " .. api_key,
+    }
+    body = {
+      model = M.config.openai.model,
+      messages = {
+        { role = "system", content = system_prompt },
+        { role = "user", content = user_prompt },
+      },
+      temperature = M.config.openai.temperature,
+      max_tokens = M.config.openai.max_tokens,
+      stream = true,
+    }
   end

-  print("Sending request to OpenAI API...")
+  print("Sending request to " .. (M.config.provider == "azure" and "Azure OpenAI" or "OpenAI") .. " API...")

   curl.post(url, {
     ---@diagnostic disable-next-line: unused-local
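
To make the Azure branch above concrete, the sketch below assembles the same request URL with hypothetical placeholder values (the resource name and deployment are not taken from the commit):

```lua
-- Illustration of the URL built by the Azure branch above (placeholder values).
local azure = {
  endpoint = "https://my-resource.openai.azure.com",
  deployment = "gpt-4o",
  api_version = "2024-05-13",
}
local url = azure.endpoint
  .. "/openai/deployments/"
  .. azure.deployment
  .. "/chat/completions?api-version="
  .. azure.api_version
print(url)
-- https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-05-13
```

Note the authentication difference visible in the diff: the Azure request carries the key in an `api-key` header, while the OpenAI branch sends `Authorization: Bearer <key>`.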
@@ -334,21 +371,8 @@ local function call_openai_api_stream(question, code_lang, code_content, on_chun
         end)
       end
     end,
-    headers = {
-      ["Content-Type"] = "application/json",
-      ["Authorization"] = "Bearer " .. api_key,
-      ["api_key"] = api_key,
-    },
-    body = fn.json_encode({
-      model = M.config.openai.model,
-      messages = {
-        { role = "system", content = system_prompt },
-        { role = "user", content = user_prompt },
-      },
-      temperature = M.config.openai.temperature,
-      max_tokens = M.config.openai.max_tokens,
-      stream = true,
-    }),
+    headers = headers,
+    body = fn.json_encode(body),
   })
 end

@@ -711,6 +735,13 @@ M.config = {
     temperature = 0,
     max_tokens = 4096,
   },
+  azure = {
+    endpoint = "", -- example: "https://<your-resource-name>.openai.azure.com"
+    deployment = "", -- Azure deployment name (e.g., "gpt-4o", "my-gpt-4o-deployment")
+    api_version = "2024-05-13",
+    temperature = 0,
+    max_tokens = 4096,
+  },
   claude = {
     endpoint = "https://api.anthropic.com",
     model = "claude-3-5-sonnet-20240620",