Connecting Your Website as a Knowledge Source for Your LiveAgent Chatbot Using FlowHunt Schedules

This article explains how to connect your own website as a knowledge source for your LiveAgent chatbot using FlowHunt's Schedules feature.

Overview

Your chatbot can automatically learn from your website content to provide accurate, up-to-date answers to customer questions. This is done through FlowHunt's Schedules feature, which periodically crawls your website and indexes the content for the chatbot to use.

Prerequisites

  • Access to your FlowHunt account (connected to your LiveAgent account)
  • The URL of the website you want to connect
  • Sufficient credits in your FlowHunt account to index the knowledge sources

Step-by-step setup

1. Access the Schedules section

Log in to your FlowHunt dashboard and navigate to the Schedules section.

2. Add a new schedule

Click the Add Schedule button to create a new schedule.

3. Select the crawl type

Choose the type of crawl that best fits your needs:

  • Domain crawl – Crawls your entire domain, learning from every page
  • Single URL crawl – Crawls one specific page; useful for pages that update frequently or for external sources
  • Sitemap crawl – Uses your sitemap to crawl pages more efficiently
  • YouTube channel – Indexes closed captions from your YouTube videos

For connecting your website as a knowledge source, select Domain crawl to index all pages on your website.
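Note that whether a domain crawl reaches every page can depend on your site's robots.txt rules. The article doesn't state how FlowHunt handles robots.txt, so treat this as a general crawling note: you can check what a standards-respecting crawler would be allowed to fetch using Python's standard library (the inline robots.txt below is a made-up example; point `set_url` at your live `/robots.txt` instead to test your own site):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt for illustration only -- replace with your own.
# To test a live site: rp.set_url("https://www.example.com/robots.txt"); rp.read()
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A generic crawler ("*") may fetch public pages but not /private/ ones.
print(rp.can_fetch("*", "https://www.example.com/docs"))       # True
print(rp.can_fetch("*", "https://www.example.com/private/x"))  # False
```

If pages seem to be missing from a crawl, a `Disallow` rule like the one above is a common reason.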

4. Set the crawl frequency

Choose how often the schedule should run:

  • Daily – Best for frequently updated content
  • Weekly – Recommended for most websites
  • Monthly – Suitable for websites with infrequent updates
  • Yearly – For static content that rarely changes

Note: Crawling costs credits. Consider how often your content updates and how important it is for the chatbot to have the latest information.

5. Enter the website URL

Enter your website URL in the format that matches the crawl type you selected:

  • For domain crawl: https://www.example.com
  • For single URL crawl: https://www.example.com/specific-page
  • For sitemap crawl: https://www.example.com/sitemap.xml
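If you want to double-check a URL before pasting it in, a small helper like the one below can guess which crawl type it fits. This is an illustrative sketch using Python's standard library, not part of FlowHunt, and FlowHunt performs its own validation:

```python
from urllib.parse import urlparse

def classify_crawl_url(url: str) -> str:
    """Guess which FlowHunt crawl type a URL is shaped for (illustrative only)."""
    parts = urlparse(url)
    # Must be an absolute http(s) URL with a hostname.
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError(f"Not an absolute http(s) URL: {url!r}")
    path = parts.path.rstrip("/")
    if path.endswith(".xml"):
        return "sitemap crawl"     # e.g. https://www.example.com/sitemap.xml
    if path:
        return "single URL crawl"  # e.g. https://www.example.com/specific-page
    return "domain crawl"          # e.g. https://www.example.com

print(classify_crawl_url("https://www.example.com"))            # domain crawl
print(classify_crawl_url("https://www.example.com/sitemap.xml"))  # sitemap crawl
```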

6. Create the schedule

Click Add New Schedule to create the schedule. The crawl will begin and show as pending. Refresh the page to see the progress.

How the chatbot uses this data

Once your schedule is set up and the website content is indexed, the chatbot will automatically use this information through the Document Retriever component. When customers ask questions, the chatbot will search your indexed website content to provide accurate, relevant answers.

Managing your schedule

After creation, you can:

  • View crawled URLs – Click Show found URLs to see all pages that were indexed
  • Check details – Click Details to see specific information about each crawled page
  • Re-run manually – Use the repeat icon to manually run the schedule anytime without waiting for the next scheduled time
  • Edit or delete – Modify the schedule settings or remove it if needed

Troubleshooting

If your schedule shows an error status, hover over the error tag to see what went wrong. Common issues include:

  • Invalid URL format
  • Website accessibility issues
  • Insufficient credits in your FlowHunt account
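You can rule out the first two issues yourself before re-running the schedule. The sketch below, using only Python's standard library, checks that the URL is well-formed and that the site answers a request; it is a quick local check, not a reproduction of FlowHunt's actual crawler:

```python
import urllib.request
from urllib.parse import urlparse

def check_crawlable(url: str, timeout: float = 10.0) -> str:
    """Return a short status message for a would-be crawl target (illustrative)."""
    parts = urlparse(url)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        return "invalid URL format"
    try:
        # HEAD avoids downloading the page body; some servers reject HEAD,
        # in which case retrying with GET is a reasonable fallback.
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "crawl-check/1.0"}
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except Exception as exc:  # DNS failure, timeout, TLS error, 4xx/5xx, ...
        return f"not reachable: {exc}"

print(check_crawlable("not-a-url"))  # invalid URL format
```

A "not reachable" result usually points to firewall rules, bot blocking, or DNS problems on the website side rather than anything in FlowHunt.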

For more information about FlowHunt Schedules and knowledge sources, visit the FlowHunt documentation or contact LiveAgent support.