The ideal tool to slice and dice your data.
Quick filters can be formed from dimensions, measures and calculations, and they support top-N, bottom-N, wildcard matching, type-in and multi-select. Far more powerful (at least in a basic sense) than parameters, these filters really bring your visualisations to life.
It's true, but quick filters are also among the biggest culprits for terrible performance. This is yet another feature that fails to convey its impact. You see, in order to build a quick filter, Tableau needs to execute a SELECT DISTINCT across the entirety of the column; it matters not whether the table contains a thousand rows or one hundred trillion, it still needs to run that query. And as many users routinely plug Tableau into tens and hundreds of millions of rows of data, consider how much processing just one quick filter takes to build.
And then the source engine must start again for the next quick filter. So if your table contains 10M rows and you have just five quick filters, the data engine must run this query once for each filter, touching every record in the set:
```sql
SELECT Field_1 AS Quickfilter_1
FROM your_table  -- table name illustrative
GROUP BY Field_1
```
So essentially, the data engine will have touched 50 million records from a 10 million record set. Now do you understand?
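To make the repetition concrete, here is roughly what the engine is asked to do for five quick filters; each query is a separate full-column scan of the same table (field and table names here are illustrative):

```sql
-- One query per quick filter; each one touches all 10 million rows
SELECT Field_1 AS Quickfilter_1 FROM your_table GROUP BY Field_1;
SELECT Field_2 AS Quickfilter_2 FROM your_table GROUP BY Field_2;
SELECT Field_3 AS Quickfilter_3 FROM your_table GROUP BY Field_3;
SELECT Field_4 AS Quickfilter_4 FROM your_table GROUP BY Field_4;
SELECT Field_5 AS Quickfilter_5 FROM your_table GROUP BY Field_5;
```

Five scans of a 10M-row table is where the 50 million touched records come from.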
There are, however, two types of filter that don't touch any records:
- Type-in, allowing users to type their filter value:
  - Be very careful with this, though, as it results in a full-text search to filter the records, which can be very slow; imagine wildcard-matching a partial string against 10 million records.
- The trusty parameter:
  - A static list that the developer must maintain. Multi-select support is just about the only thing that would truly make this option shine but, alas, parameters are single-select only, which is a major turn-off for most users, and for good reason too.
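To see why the type-in option above can be slow, the wildcard match it produces looks roughly like the following (table and column names are illustrative). The leading `%` defeats any ordinary index on the column, so the engine must scan every row:

```sql
-- Leading wildcard means no index can be used: full scan of all rows
SELECT *
FROM tickets
WHERE ticket_summary LIKE '%search text%';
```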
I can offer two further solutions that will enable you to go crazy with your filters: (1) use a parameter to filter your initial set, or (2) keep all your filters on a single page.
(1.) Remember the quote I put at the top of this page about needing to rebuild the R&D report? Well, part of the problem was that it was plugged into a 2-billion-row set with around 8 quick filters that needed to be generated, causing huge slow-downs on top of what was already a 650ms network lag.
However, the R&D entries were all Jira tickets and, frustratingly, users were only interested in a single ticket ID each time they used the report, with the average data size per ticket being around 3 million rows. Part of the back-end rebuild I mentioned was to partition the data by ticket ID, which hugely improved performance; but from the front end, simply converting the connection to Custom SQL with a type-in parameter in the WHERE clause (eventually populated via a URL call) resolved the problem.
How this worked was...
Tableau will not attempt to render any item, quick filters included, without data. An empty parameter meant that no data was being served, so the report had a chance to open and render (nothing but headers, footers, titles and parameters would be drawn). Then, once the user entered their ticket ID, the data was filtered from the outset from 2 billion rows down to 3 million, and the quick filters were built using the same SELECT DISTINCT, only against a considerably smaller set.
Therefore, by using Custom SQL and a parameter to reduce your set to just what is of interest from the outset, your quick filters can be built against a much smaller set.
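A minimal sketch of the Custom SQL this describes, assuming a table named `rd_tickets` and a string parameter named "Ticket ID" (both names hypothetical; Tableau substitutes `<Parameters.Parameter Name>` placeholders into Custom SQL at query time):

```sql
-- While the parameter is empty, no rows match and nothing is rendered;
-- once populated, only that ticket's ~3M rows are served downstream.
SELECT *
FROM rd_tickets
WHERE ticket_id = <Parameters.Ticket ID>
```

Every quick filter's SELECT DISTINCT then runs against this filtered result rather than the full 2-billion-row table.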
(2.) The second alternative I can offer came about by pure accident:
A second major drawback of the quick filter: if the same filter is placed on more than one dashboard or chart page, and enough activity takes place after the filters are first built on page 1 to clear them from the cache, then on moving to the other pages the quick filters must be regenerated with the same SELECT DISTINCT query.
I provided some consultancy to a company that had 20 filters taking up huge amounts of screen space, so, to save space, we created a dedicated page of filters. It was awkward for end-users to have to go back to the filters page whenever they wanted to adjust something, and I had to create an indicator on every page to flag when and which filters were applied, but the users were happy with it.
Moreover, by moving the quick filters to the one page, despite their number they would only query the data once per session, which made for an improved experience.