
Have you ever considered the power of integrating a built-in search platform directly within your AlloyDB PostgreSQL database? Imagine the ease and efficiency of searching your critical data using enterprise-ready tooling.
The Agentspace Assistant and the Agent Development Kit from Vertex AI are now integrated with AlloyDB: you can use the Agentspace Assistant to securely search AlloyDB, and you can build custom agents with AlloyDB for your agentic workflows.
This article introduces the capabilities of the newly launched Agentspace integration with AlloyDB, which lets you query and search your AlloyDB data with Gemini’s advanced reasoning and Google-quality search. This direct integration matters because it gives Agentspace users answers grounded in real-time enterprise knowledge pulled directly from your live AlloyDB data, so the natural language answers are not only intuitive but also accurate and up to date, truly reflecting the current state of your business operations. We will also walk through how easy it is to set this up and have it working immediately on enterprise operational data in AlloyDB.
Let’s start off by introducing our flagship Google Agentspace. There is a long-standing problem of fragmented technology systems within businesses: years ago, you had a lot of technology that didn’t work that well; today, you have a lot of technology that works amazingly well, but very little of it works together.
Cutting-edge technologies are proliferating: Large Language Models (LLMs), intelligent agents, generative AI, and tools for automated workflows and enterprise-wide search, each holding immense promise. Yet this rapid innovation, while exciting, also introduces a fresh wave of complexity. Businesses frequently ask:
- How do we make sense of this vast array of new tools?
- How do they relate to one another?
- Most importantly, how can they be used to solve real business problems?
The fundamental obstacle is not just understanding these technologies in isolation, but using them to achieve tangible outcomes. You need to be able to:
- Access information swiftly: Whether for employees or customers, timely access to pertinent information is paramount. How can these new AI-driven tools facilitate the immediate sifting through extensive data to reveal what is relevant?
- Connect data sources: The majority of businesses contend with data dispersed across disparate systems, silos, and formats. The true potency emerges when these divergent sources can be linked. How can LLMs and agents aid in bridging these voids and forging a more cohesive perspective of your data?
- Take action: Ultimately, insights are only valuable when they culminate in action. How can these technologies be integrated into workflows to not only analyze or predict, but to actively support decision-making and automate tasks?
This is precisely where Google Cloud comes in. We know how daunting it is to navigate this intricate new technological landscape. Our objective is to abstract away a significant portion of the underlying complexity and to be your partner in streamlining this journey, so you can capitalize on these advancements without becoming entangled in their intricacies.
What if you could pull together search, AI, and agents across your enterprise into a single platform? You could fundamentally change the way your workforce interacts with information and, in turn, drive positive business outcomes. This is where Google Agentspace comes in. By providing a single, secure platform to build, manage, and adopt AI agents at scale, it unlocks the potential of individuals, organizations, and entire enterprises, and it delivers business value immediately using your existing data, so you don’t have to wait for long data and AI integration projects just to get started. Powering this search capability with your enterprise data in AlloyDB, and leveraging AlloyDB’s Natural Language features, helps enterprises search structured and unstructured operational data.
Let’s jump to the fun part now: how to set this up from the Google Cloud console. Consider the data below, already available in my AlloyDB database in an “employee” table consisting of employee details for an organisation. (Fun tip: the schema and the ingested data were generated from Agentspace itself 😏)
The data comprises employee-level information, from the CEO all the way down to vendors, including personal details, organisational level, and the business entities they are part of.
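For reference, here is a minimal, hypothetical sketch of what the employee table could look like, inferred from the schema we register with Agentspace later in this article; the column names come from that schema, but the types and lengths here are assumptions.
-- Hypothetical DDL for the employee table, inferred from the schema
-- registered with Agentspace further below; adjust types to match your data.
CREATE TABLE public.employee (
  rownumber         integer,
  businessentityid  integer,
  nationalidnumber  character varying(15),
  loginid           character varying(256),
  organizationlevel integer,
  jobtitle          character varying(50),
  birthdate         date,
  maritalstatus     character(1),
  gender            character(1),
  hiredate          date,
  salariedflag      character(1),
  vacationhours     smallint,
  sickleavehours    smallint,
  currentflag       character(1),
  rowguid           character varying(38),
  modifieddate      date
);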
Start off by opting in to Agentspace connectivity with AlloyDB and to AlloyDB AI using the forms here and here respectively. Once access has been granted, install the AlloyDB AI Natural Language extension (don’t worry, no downtime on your cluster!) using the command below:
CREATE EXTENSION alloydb_ai_nl cascade;
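Optionally, you can confirm the extension was created by checking the pg_extension catalog; this is just a convenience check, not part of the documented setup:
-- Optional check: list the installed AlloyDB AI extensions and their versions.
SELECT extname, extversion
FROM pg_extension
WHERE extname LIKE 'alloydb_ai%';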
AlloyDB AI Natural Language uses an nl_config to associate an application with certain schemas, query templates, and model endpoints. A large application can use different configurations for different parts of the application, as long as you specify the right configuration when a question or prompt is sent across.
For this specific example, create a natural language configuration by following the steps below (for clarity, a brief description of what each command does accompanies the SQL); a quick end-to-end sanity check of the configuration follows these steps.
- Create a natural language configuration
SELECT
alloydb_ai_nl.g_create_configuration(
'my_app_config' -- configuration_id
);
- Register a schema for a specified configuration
SELECT
alloydb_ai_nl.g_manage_configuration(
operation => 'register_schema',
configuration_id_in => 'my_app_config',
schema_names_in => '{my_schema}'
);
- Add general context for the configuration rules
SELECT
alloydb_ai_nl.g_manage_configuration(
'add_general_context',
'my_app_config',
general_context_in => '{"If the user asks for a certain name and multiple entries exist for the same name, return all the results rather than the most suitable one"}'
);
- To generate context for schema objects, call the following API. For best results, make sure that the database tables contain representative data. Context is generated for all schema objects (tables, views, materialized views, and columns) within the scope of the provided nl_config.
SELECT
alloydb_ai_nl.generate_schema_context(
'my_app_config' -- nl_config
);
- Review the generated schema contexts by running the following statement (sanity check):
SELECT schema_object, object_context
FROM alloydb_ai_nl.generated_schema_context_view;
- Apply the generated context to all schema objects (tables, views, materialized views, and columns) within the scope of the nl_config.
SELECT
alloydb_ai_nl.apply_generated_schema_context(
'my_app_config' -- nl_config
);
- For a table, view, or materialized view:
SELECT
alloydb_ai_nl.get_relation_context(
'my_schema.my_table'
);
- For a column:
SELECT
alloydb_ai_nl.get_column_context(
'my_schema.my_table.column1'
);
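With the configuration in place, you can optionally sanity-check it end to end from SQL before wiring it into Agentspace. The sketch below uses the alloydb_ai_nl.get_sql function from the AlloyDB AI natural language documentation; treat the exact signature and return shape as assumptions to verify against your AlloyDB version, and note that the sample question is purely illustrative.
-- Hypothetical end-to-end check: generate SQL from a natural language
-- question using the configuration created above. Verify the function name
-- and return shape against the AlloyDB AI docs for your version.
SELECT
alloydb_ai_nl.get_sql(
'my_app_config', -- nl_config
'How many employees are at organization level 2?' -- natural language question
) ->> 'sql';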
Once that is set up, configure authentication between AlloyDB and Agentspace by running the following commands:
CREATE ROLE agentspace WITH LOGIN PASSWORD '';
GRANT SELECT ON TABLE "public"."employee" TO agentspace;
--Replace 'public' and 'employee' with your schema and table name respectively
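As an optional convenience (not part of the official setup), you can confirm the grant took effect by querying information_schema:
-- Optional check: confirm the agentspace role has SELECT on the employee table.
SELECT grantee, privilege_type
FROM information_schema.role_table_grants
WHERE table_schema = 'public'
AND table_name = 'employee'
AND grantee = 'agentspace';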
Additionally, make sure that you grant the Cloud AlloyDB for PostgreSQL Database User role to the following principal: service-PROJECT_NUMBER@gcp-sa-discoveryengine.iam.gserviceaccount.com. This is followed by the final step of linking Agentspace with AlloyDB.
Next, create a datastore in Agentspace with your AlloyDB for PostgreSQL connection details using the API.
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-H "X-Goog-User-Project: archana-primary" \
"https://discoveryengine.googleapis.com/v1alpha/projects/archana-primary/locations/global/collections/default_collection/dataStores?dataStoreId=alloydb" -d '{
"displayName": "DATA_STORE_NAME",
"federatedSearchConfig": {
"alloyDbConfig": {
"alloydbConnectionConfig": {
"instance": "alloy-data",
"database": "postgres",
"user": "postgres",
"password": "Password",
"authMode": "AUTH_MODE_SERVICE_ACCOUNT"
},
"alloydb_ai_nl_config": { "nlConfigId": "my_app_config" }
}
},
"industryVertical": "GENERIC",
"solutionTypes": ["SOLUTION_TYPE_SEARCH"]
}'
Next, upload the schema information for your AlloyDB for PostgreSQL database; in this example, that is the schema of the employee table.
curl -X PATCH \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
"https://discoveryengine.googleapis.com/v1beta/projects/archana-primary/locations/global/collections/default_collection/dataStores/alloydb/schemas/default_schema" \
-d '{
"structSchema": {
"$schema": "https://json-schema.org/draft/2020-12/schema",
"type": "object",
"properties": {
"employee": {
"type": "object",
"properties": {
"rownumber": { "type": "integer" },
"businessentityid": { "type": "integer"},
"nationalidnumber": { "type": "character", "keyPropertyMapping": "national ID" },
"loginid": { "type": "character" },
"organizationlevel": { "type": "integer"},
"jobtitle": { "type": "character", "keyPropertyMapping": "job title" },
"birthdate": { "type": "date", "keyPropertyMapping": "birth date" },
"maritalstatus": { "type": "character", "keyPropertyMapping": "marital status" },
"gender": { "type": "character", "keyPropertyMapping": "Gender" },
"hiredate": { "type": "date", "keyPropertyMapping": "Hire date" },
"salariedflag": { "type": "character", "keyPropertyMapping": "salaried flag" },
"vacationhours": { "type": "smallint", "keyPropertyMapping": "vacation hours" },
"sickleavehours": { "type": "smallint", "keyPropertyMapping": "sick leave hours" },
"currentflag": { "type": "character"},
"rowguid": { "type": "character"},
"modifieddate": { "type": "date", "keyPropertyMapping": "modified date" }
}
}
}
}
}'
Finally, update the UI configuration so that search results from AlloyDB for PostgreSQL are displayed correctly in Agentspace. This step is optional; it is usually needed only if you want to add some context to your data and make it more human readable (if your column names are self-explanatory, which is the case about 90% of the time, you can skip it entirely!).
curl -X PATCH \
-H "X-Goog-User-Project: archana-primary" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
"https://discoveryengine.googleapis.com/v1alpha/projects/archana-primary/locations/global/collections/default_collection/engines/enterprise-search-alloydb-_1747151802542/widgetConfigs/default_search_widget_config?updateMask=uiSettings" -d '{
"uiSettings": {
"dataStoreUiConfigs": [
{
"name": "projects/archana-primary/locations/global/collections/default_collection/dataStores/alloydb",
"id": "alloydb",
"fieldsUiComponentsMap": {
"title": {
"field": "title",
"displayTemplate": "{value}"
},
"text1": {
"field": "description",
"displayTemplate": "{value}"
},
"url": {
"field": "url",
"displayTemplate": "{value}"
}
}
}
],
"interactionType": "SEARCH_ONLY"
}
}'
And with that, your enterprise data on AlloyDB is connected to Agentspace — go and query it from Agentspace, live, with no further modifications needed!
The obvious question that comes up: what if I want to make this whole process simpler? Good news: the later steps of defining a datastore and defining the schema will be eliminated down the line, to make this more scalable and efficient for organisations. The other common question is how to join data from multiple sources; as of today, that means updating your config file with multiple tables, as shown here.