The keyboard shortcut (Option ⌥ + B) might no longer work by default due to a recent update to the Chrome extension platform.
This should only affect users who installed the Chrome extension before November 2022.
You can always create (or customize) a shortcut to launch Bardeen in the extensions settings.
Copy this link and paste it into your browser's address bar:
chrome://extensions/shortcuts
Then set the shortcut:
Try launching Bardeen with your shortcut on this page!
It's not possible right now to change your email or transfer your Playbooks in bulk.
But it is possible to transfer them one by one. There are two ways:
1. Share the playbooks from one account to the other. You can do this by having two Chrome profiles.
2. Export/import the playbooks from the Settings page.
The biggest limitation is that you won't be able to edit the scraper templates from shared playbooks.
Still, we know this is not the ideal workflow, and a "change email" feature is something we can look into. You can upvote this feature here.
Let's say you want to get data starting from the 2nd or 3rd item you're scraping, not the first.
You can do this by using the "Get slice of array" action.
This is an example from a Playbook that scrapes a list of people on LinkedIn.
By using the "Get slice of array" action, you can specify which rows you want to get from a Scraper's column.
If you need to do this for multiple columns, the current workaround is to use a "Get slice" action for every column and do the field mapping when saving the data to your database.
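If it helps to think of this in code, the action behaves much like list slicing. Here's a minimal Python sketch (the scraped_rows list is a made-up stand-in for one Scraper column):

```python
# Stand-in for one column returned by the Scraper.
scraped_rows = ["Alice", "Bob", "Carol", "Dan", "Eve"]

# Skip the 1st item and keep everything from the 2nd onward.
from_second = scraped_rows[1:]    # ["Bob", "Carol", "Dan", "Eve"]

# Keep only the 3rd through 5th items (indices are 0-based).
middle = scraped_rows[2:5]        # ["Carol", "Dan", "Eve"]

print(from_second, middle)
```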
Yes, LinkedIn can detect irregular behavior and warn you (or suspend your account) for inappropriately using certain automations. You might get a warning like this first.
To avoid this, we don't recommend scraping large lists in a short amount of time.
You can modify your playbooks/autobooks to add a delay each time a page is opened, to minimize the chance of this happening (see the sketch below).
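As a rough mental model only (this is not Bardeen's actual implementation; open_and_scrape and the URLs are placeholders), the idea is:

```python
import random
import time

def open_and_scrape(url: str) -> None:
    # Placeholder for Bardeen's "open the page and scrape it" step.
    print(f"scraping {url}")

profile_urls = [
    "https://www.linkedin.com/in/example-one",
    "https://www.linkedin.com/in/example-two",
]

for url in profile_urls:
    open_and_scrape(url)
    # Pause a random 10-30 seconds between pages so the traffic
    # looks less like a burst of automated requests.
    time.sleep(random.uniform(10, 30))
```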
Here's more information on this:
For your shared playbook to be updated, you need to unshare your playbook and share it again. This will generate a new link.
To do this:
1. Edit the playbook.
2. Unshare it (click on share, then the “unshare” button).
3. Share it again.
You can do this with the "Update or add" action, which checks whether a page has already been created and updates it, or creates a new Notion page if not.
To use it, run the "Find Notion pages" action first and relate it to the "Update or add" action (a conceptual sketch follows).
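Conceptually, the pair of actions performs an "upsert". Here's an illustrative Python sketch (the data structure and function are invented for the example, not Bardeen's API):

```python
def upsert_page(database: list, title: str, fields: dict) -> None:
    # "Find Notion pages": look for an existing page with this title.
    matches = [page for page in database if page["title"] == title]

    if matches:
        # "Update or add": a match exists, so update it in place.
        matches[0].update(fields)
    else:
        # No match: create a new page instead.
        database.append({"title": title, **fields})

db = []
upsert_page(db, "Acme Corp", {"status": "Lead"})
upsert_page(db, "Acme Corp", {"status": "Customer"})  # updates, no duplicate
print(db)  # [{'title': 'Acme Corp', 'status': 'Customer'}]
```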
You can modify the "When scheduled event occurs" trigger by using commands. One of the suggested options is running it only on weekdays.
This is possible! Edit your automation and add the "Convert table to CSV" action at the end.
You'll need to input a table (e.g. a table with scraped data) and that's it! You should get the CSV once you run the playbook.
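If you're curious what the conversion amounts to, here's a conceptual Python sketch (the sample table is made up; Bardeen does this internally):

```python
import csv
import io

# Made-up scraped table: a list of rows with named columns.
table = [
    {"name": "Alice", "title": "Engineer"},
    {"name": "Bob", "title": "Designer"},
]

# Write the rows out as CSV text, header first.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=table[0].keys())
writer.writeheader()
writer.writerows(table)

print(buffer.getvalue())
# name,title
# Alice,Engineer
# Bob,Designer
```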
If you have a single-page scraper (like one that scrapes a LinkedIn profile) and want to run it on a list of profiles, you can do this by using the "Scrape data in the background" action.
Here's a demo of how that can be done.
Yes. Since Bardeen lives in your browser, Playbooks like Enrich LinkedIn links from a Google Sheet essentially open the LinkedIn profiles in the background and scrape their information.
This happens from your currently opened LinkedIn account, so people can see you've visited their profile.
You can re-use any playbook that's on Bardeen by:
1) Opening it in the Builder and editing it, or
2) Importing the playbook into the Builder.
Check this video on how you can do this.
Pages like Facebook are highly dynamic, so some fields (like the phone number on a Facebook page) can be located differently from page to page. That's why, even if you selected a field, you might get mixed results.
In this case, we advise checking whether one of the existing playbooks is already built to scrape that page. Here's a tutorial on how you can modify a playbook from the existing ones.
If you don't find a playbook for the page you need, you'll have to build a scraper template with advanced CSS selectors to reliably get the fields every time.
This takes some technical knowledge, so in these cases we support you by building a playbook that has those advanced selectors included. This is a manual effort that can take some days. Let us know your case on the Slack Community and we'll help you build a more reliable scraper for your case.
On the field mapping, try selecting one field, then appending a comma "," and clicking the input again to select the second field.
Another workaround is to use the "Merge texts" action to append the fields you need, and then link the result in the database action.
Yes, it's possible to build a model to scrape OpenSea.
That said, scraping OpenSea collections is unfortunately difficult right now, as their website changes dynamically on scroll to prevent scraping.
This causes unexpected behaviors that limit the number of results with Bardeen's scraper.
This is an issue we have mapped, but there is no estimated time for a fix.
You can get more details and follow up on this issue at this Canny ticket.
Yes, you can use the "Ask me every time" command to customize a message every time you run a Playbook.
This is useful in use cases like Gmail or Slack messages.
You can also use the “Merge texts” action to build dynamic texts that can combine fixed messages, data from past actions, and custom messages.
Check the image for this playbook example.
Unfortunately, this only works for playbooks that are triggered on-click. If you set an Autobook with "ask me every time", you will only be able to set those values once.
We're considering having Autobooks also pop up to "ask you every time". This is a feature request you can upvote, comment on, and follow up on in this Canny ticket.
We don't have that feature available.
It is proposed by our community on Canny. You can upvote it here.
A common workaround is duplicating a playbook, with all its actions, and editing the new playbook from there.
You can do this by editing the "When scheduled event occurs" trigger, so it starts on a particular day.
This example triggers on Sunday at 9 am.
For more information, check out this article on how to use recurring tasks and customize them your own way.
You can only send URL-type files (like image URLs) to these database fields; we don't support sending screenshot files yet.
A workaround in these cases is to add the screenshot file to Google Drive and add that file's URL to your database.
We have plans to make this available later on. You can upvote and be in the loop for this feature request on Canny.
Adding files (images or PDFs) into Airtable Database
We plan to have a Freemium model.
This is our current plan, though things might change as we learn more.
Our mission is to bring automation to everyone. This means that we'll always have a powerful free plan, especially for our early users!
You can build the JSON format using the Merge Texts action this way.
This should work for sending multiple variables too. For more context, check this conversation from the Slack community.
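As an illustration of what the merged text needs to end up as (the field values below are invented, and Bardeen's Merge texts UI looks different, but the resulting string is the same idea):

```python
import json

# Invented values standing in for results from earlier actions.
name = "Jane Doe"
company = "Acme Corp"

# "Merge texts" concatenates fixed fragments with variables, much like
# this manual assembly. The quotes around values matter for valid JSON.
payload = '{"name": "' + name + '", "company": "' + company + '"}'

# The merged string must parse as JSON to be usable downstream.
print(json.loads(payload))  # {'name': 'Jane Doe', 'company': 'Acme Corp'}
```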
We're working on a feature called String Templating that will allow users to build a string from multiple variables more intuitively.
You can follow up and vote for that feature here: https://bardeen.canny.io/features/p/string-templating
When you integrated Notion, you might not have selected all the pages in your workspace.
In this case, check the connections button and make sure that the database has the Bardeen connection added.
You can also delete the Notion integration from the app Settings and integrate it again, selecting the databases you need.
We only support entering a static input on a scraper template, not dynamic values.
For instance, you can build a scraper template for a LinkedIn profile that's able to connect and send the same message every time.
But in this case, you can not customize the message with dynamic values.
Supporting dynamic inputs is a feature that you can upvote at this Canny post: Form Filling: support for dynamic input
You can check our current integrations at our integrations page and request new ones from our Canny suggestions.
In this case, we don't have an integration with Webflow.
Yet, the easiest way to send or receive data from unsupported apps is to set up a webhook on a tool like Zapier or Make, and set up a Bardeen Playbook/Autobook using the "Send HTTP POST/GET" actions.
This blog post might come in handy: https://nocodequest.com/webflow-webhooks-with-integromat/
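For a sense of what the "Send HTTP POST" action does under the hood, here's a minimal sketch (the webhook URL and payload are placeholders you'd replace with your own):

```python
import json
from urllib.request import Request, urlopen

# Placeholder catch-hook URL from Zapier/Make; use your own.
WEBHOOK_URL = "https://hooks.zapier.com/hooks/catch/123456/abcdef/"

# Made-up payload standing in for data from earlier playbook actions.
data = {"title": "New blog post", "slug": "new-blog-post"}

request = Request(
    WEBHOOK_URL,
    data=json.dumps(data).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# The webhook replies with 200 when it accepts the payload.
with urlopen(request) as response:
    print(response.status)
```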
Right now we don't have a solution for paginating on pages like this one.
What you can do in those cases is run the "Scrape data on active tab" action on each page and paginate manually.
Yet, this is a feature we have mapped as a product request, since it happens on multiple websites.
Check this Canny ticket to see other workarounds for this problem.
Use the "When scheduled event occurs" trigger, so it triggers daily at a specific hour of the day.
This trigger will allow you to monitor the data on a URL or a list of URLs.
Running Autobooks via cloud is something that we may support in the future.
Currently, Autobooks are only triggered while you are online.
You may upvote this feature and get updates on this Canny ticket (our public roadmap).
It can be done, mainly thanks to the "Open link pattern" action.
This video shows an example of how you can search Google, LinkedIn, and Quora.
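Conceptually, the action substitutes your search term into a URL template, something like this sketch (the templates below are illustrative examples, not a list Bardeen ships with):

```python
from urllib.parse import quote_plus

# Example query; "Open link pattern" fills it into each URL template.
query = "growth marketing"

templates = [
    "https://www.google.com/search?q={q}",
    "https://www.linkedin.com/search/results/all/?keywords={q}",
    "https://www.quora.com/search?q={q}",
]

for template in templates:
    # URL-encode the query so spaces and symbols survive the substitution.
    print(template.format(q=quote_plus(query)))
```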
This is possible on Notion with the following actions:
1. Find a Notion database — you can search by title or fields.
2. Scrape data in the background — using the URLs from the pages found in step 1.
3. Update or add Notion page — set to update the page found in step 1.
This playbook provides the structure:
https://bardeen.ai/s/SjoXZnpGGTtt
⚠️ Needs to be customized to work.
This workflow is currently not possible on other database apps like Airtable or Coda.
That's right, not all Playbooks run out of the box. Some need to be configured with each user's inputs in order to run properly.
The good news is there's an article for each playbook in the catalog that guides you through the inputs or changes needed to run it successfully.
You can find it by searching the Playbook’s name at bardeen.ai/playbooks.
For example: If "XYZ" exists then "scrape data XYZ" else "populate table field XYZ with EMPTY".
The scraper itself has no conditionals. But they're usually not needed, since most conditions can be handled when you send the data to a database (Notion, Airtable, Coda).
For instance:
1. Scrape the active tab.
2. Add a conditional: check if step 1's field XYZ is empty.
3. If it is empty, add the data to Notion with the XYZ field set to the fixed text "EMPTY".
4. If it's not empty, add the data to Notion with the XYZ field's scraped result.
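In code terms, the branching amounts to the sketch below (illustrative Python only; save_row and the XYZ field are invented for this example, not Bardeen functions):

```python
def save_row(scraped: dict) -> dict:
    # Branch on whether the scraped field came back empty.
    value = scraped.get("XYZ")
    if not value:
        # Field missing or empty: store a fixed placeholder instead.
        return {"XYZ": "EMPTY"}
    # Field found: store the scraped value as-is.
    return {"XYZ": value}

print(save_row({"XYZ": "some scraped text"}))  # {'XYZ': 'some scraped text'}
print(save_row({}))                            # {'XYZ': 'EMPTY'}
```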