PS: haha, the wording here reads a bit robotic.

1. Notes

Because of restrictions on the OpenAI service, please use the Azure OpenAI service instead; you will need to apply for Azure OpenAI access. The model currently in use is GPT-4-1106-preview (also known as GPT-4 Turbo).

2. Introduction

GPT is already a widely known hot topic and will remain one. In our internal assistant's chat scenario we added function calling, which lets the assistant "go online" through function calls. For usage details, see Microsoft's official documentation: How to define functions

3. Function Calling

The latest versions of GPT-3.5 Turbo and GPT-4 have been fine-tuned to work with functions and to determine when and how a function should be called. If a request includes one or more functions, the model decides, based on the context of the prompt, whether any of them should be called. When it decides a function should be called, it responds with a JSON object containing the function's arguments. Overall, function handling breaks down into the following three steps (a minimal sketch follows the list):

  • Define functions: define the functions you want the model to call in your Azure OpenAI request.
  • Trigger functions: based on the input prompt, the model decides whether a function needs to be called.
  • Handle the response: the model receives the function's output and incorporates it into its final response.
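
To make these steps concrete, here is a minimal sketch of one function-calling round trip. It uses the same legacy openai 0.x SDK as the full examples in section 5; get_current_time and its implementation are hypothetical, and the endpoint, key, and deployment name are placeholders you must fill in yourself.

import datetime
import json
import openai

openai.api_type = "azure"
openai.api_version = "2023-07-01-preview"
openai.api_base = "<your Azure OpenAI endpoint>"
openai.api_key = "<your Azure OpenAI key>"

# Step 1: define the function the model is allowed to call (hypothetical example)
functions = [{
    "name": "get_current_time",
    "description": "Returns the current UTC time in ISO 8601 format",
    "parameters": {"type": "object", "properties": {}},
}]

messages = [{"role": "user", "content": "What time is it right now (UTC)?"}]

response = openai.ChatCompletion.create(
    deployment_id="gpt-4-1106-preview",  # your deployment name
    messages=messages,
    functions=functions,
    function_call="auto",
)
message = response["choices"][0]["message"]

# Step 2: the model decided a function call is needed
if response["choices"][0]["finish_reason"] == "function_call":
    # Run the function ourselves; the model only supplies its name and arguments
    # (ignored here because get_current_time takes no parameters)
    result = datetime.datetime.utcnow().isoformat()

    # Step 3: send the function output back so the model folds it into its answer
    messages.append({
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": message["function_call"]["name"],
            "arguments": message["function_call"]["arguments"],
        },
    })
    messages.append({"role": "function", "name": "get_current_time", "content": json.dumps(result)})

    final = openai.ChatCompletion.create(
        deployment_id="gpt-4-1106-preview",
        messages=messages,
        functions=functions,
    )
    print(final["choices"][0]["message"]["content"])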

4. Example Overview

The official sample provides two different ways to retrieve up-to-date material from the internet: Azure Search and Bing Search. Whichever you choose, the goal is the same: to access and retrieve the latest information on the web. Note that all online material is to some degree "historical", meaning the information inevitably becomes outdated over time.

The sample shows how to define and call a Bing Search function so that the chat assistant can perform real-time web searches. This way, the assistant can provide up-to-date information in a conversation and improve the user experience.

You can access the complete sample via this link: Bing Search code sample
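
If you want to verify your Bing resource before wiring it into function calling, the snippet below is a standalone sanity check of the Bing Web Search endpoint used by the sample (a sketch; the subscription key placeholder must come from your own Bing Search resource).

import requests

subscription_key = "<your Bing Search key>"
response = requests.get(
    "https://api.bing.microsoft.com/v7.0/search",
    headers={"Ocp-Apim-Subscription-Key": subscription_key},
    params={"q": "OpenAI", "textDecorations": False},
)
response.raise_for_status()
# Print the top three results: title and URL
for page in response.json()["webPages"]["value"][:3]:
    print(page["name"], page["url"])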

(Screenshot: Bing Search code sample)

5. Code with and without function calling (save a copy locally)

5.1 Without function calling
import openai

openai.api_type = "azure"
openai.api_version = "2023-07-01-preview"
openai.api_base = "<your Azure OpenAI endpoint>"  # obtain from your Azure resource
openai.api_key = "<your Azure OpenAI key>"        # obtain from your Azure resource

response = openai.ChatCompletion.create(
    engine="gpt-4-1106-preview",  # The deployment name you chose when you deployed the GPT-3.5-Turbo or GPT-4 model.
    messages=[
        {"role": "system", "content": "You are an assistant designed to help people answer questions."},
        {"role": "user", "content": "How much do you know about artificial intelligence? Please search the web for relevant material and provide references!"}
    ]
)

# Print only the response content text:
print(response['choices'][0]['message']['content'])

5.2 With function calling
config.json
{
    "DEPLOYMENT_NAME": "<Model Deployment Name>",
    "OPENAI_API_BASE": "https://<Your Azure Resource Name>.openai.azure.com",
    "OPENAI_API_KEY": "<Your Azure OpenAI Key>",
    "OPENAI_API_VERSION": "<OpenAI API Version>",

    "SEARCH_SERVICE_ENDPOINT": "https://<Your Search Service Name>.search.windows.net",
    "SEARCH_INDEX_NAME": "recipes-vectors",
    "SEARCH_ADMIN_KEY": "",
    "BING_SEARCH_SUBSCRIPTION_KEY": "<Your Bing Search key (requires a Bing Search resource)>"
}
function_bing.py
import requests
import json
import openai

# Load config values
with open(r'config.json') as config_file:
    config_details = json.load(config_file)


# Configure OpenAI environment variables
openai.api_key = config_details['OPENAI_API_KEY']
openai.api_base = config_details['OPENAI_API_BASE']
openai.api_type = "azure"
openai.api_version = config_details['OPENAI_API_VERSION']

# You need to use the 0613 version or newer of gpt-35-turbo or gpt-4 to work with functions
deployment_name = config_details['DEPLOYMENT_NAME']

bing_search_subscription_key = config_details['BING_SEARCH_SUBSCRIPTION_KEY']
bing_search_url = "https://api.bing.microsoft.com/v7.0/search"

functions = [
    {
        "name": "search_bing",
        "description": "Searches bing to get up to date information from the web",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "The search query",
                }
            },
            "required": ["query"],
        },
    }
]


def search(query):
    headers = {"Ocp-Apim-Subscription-Key": bing_search_subscription_key}
    params = {"q": query, "textDecorations": False}
    response = requests.get(bing_search_url, headers=headers, params=params)
    response.raise_for_status()
    search_results = response.json()

    output = []

    for result in search_results['webPages']['value']:
        output.append({
            'title': result['name'],
            'link': result['url'],
            'snippet': result['snippet']
        })

    return json.dumps(output)


def run_multiturn_conversation(messages, functions, available_functions, deployment_name):
    # Step 1: send the conversation and available functions to GPT
    response = openai.ChatCompletion.create(
        deployment_id=deployment_name,
        messages=messages,
        functions=functions,
        function_call="auto",
        temperature=0
    )

    # Step 2: check if GPT wanted to call a function
    while response["choices"][0]["finish_reason"] == 'function_call':
        response_message = response["choices"][0]["message"]
        print("Recommended Function call:")
        print(response_message.get("function_call"))
        print()

        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        function_name = response_message["function_call"]["name"]

        # verify the function exists
        if function_name not in available_functions:
            return "Function " + function_name + " does not exist"
        function_to_call = available_functions[function_name]

        function_args = json.loads(
            response_message["function_call"]["arguments"])
        function_response = function_to_call(**function_args)

        print("Output of function call:")
        print(function_response)
        print()

        # Step 4: send the info on the function call and function response to GPT

        # adding assistant response to messages
        messages.append(
            {
                "role": response_message["role"],
                "function_call": {
                    "name": response_message["function_call"]["name"],
                    "arguments": response_message["function_call"]["arguments"],
                },
                "content": None
            }
        )

        # adding function response to messages
        messages.append(
            {
                "role": "function",
                "name": function_name,
                "content": function_response,
            }
        )  # extend conversation with function response

        print("Messages in next request:")
        for message in messages:
            print(message)
        print()

        response = openai.ChatCompletion.create(
            messages=messages,
            deployment_id=deployment_name,
            function_call="auto",
            functions=functions,
            temperature=0
        )  # get a new response from GPT where it can see the function response

    return response


system_message = """You are an assistant designed to help people answer questions.

You have access to query the web using Bing Search. You should call bing search whenever a question requires up to date information or could benefit from web data.
"""

messages = [{"role": "system", "content": system_message},
            {"role": "user", "content": "How much do you know about artificial intelligence? Please search the web for relevant material and provide references!"}]


available_functions = {'search_bing': search}

result = run_multiturn_conversation(
    messages, functions, available_functions, deployment_name)

print("Final response:")
print(result['choices'][0]['message']['content'])
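
To ask a different question with the same loop, you can reuse the pieces defined above (a sketch; it assumes function_bing.py has already run or been imported, so run_multiturn_conversation, functions, available_functions, system_message, and deployment_name are in scope). The question here is Question 2 from the comparison in section 6.

followup_messages = [
    {"role": "system", "content": system_message},
    {"role": "user", "content": "Introduce the CEO of OpenAI. What are his notable achievements? Please provide a few reference links!"},
]
reply = run_multiturn_conversation(
    followup_messages, functions, available_functions, deployment_name)
print(reply["choices"][0]["message"]["content"])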

6. Results comparison (gray: without function calling; black: with function calling)

Question 1: How much do you know about artificial intelligence? Please search the web for relevant material and provide references!

(Screenshot: response comparison for Question 1)

Question 2: Introduce the CEO of OpenAI. What are his notable achievements? Please provide a few reference links!

(Screenshot: response comparison for Question 2)

Question 3: What are the latest developments at the frontier of artificial intelligence?

(Screenshot: response comparison for Question 3)