
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

0 votes
0 answers
33 views
SSAS Processing Sanity check: Am I using partitions effectively?
I have been rebuilding a data warehouse and lately I have been focusing on decreasing processing times. This is using MS SQL Server 2022. I have been implementing partitions on some of the larger fact tables where appropriate, and am wondering if the way I am approaching partition processing is effective. Here's my situation: I am running an SSIS job daily to do the following:

1) Truncate fact and dimension tables that will be reloaded with fresh data
2) Load fact and dimension tables with fresh data from sources
3) Rebuild indexes (>30% fragmentation, >1000 pages)
4) Reorganize indexes (>15% fragmentation, >1000 pages)
5) Update statistics
6) SSAS Process task - Process Full on the model

This job has been taking longer than desired, so I have been working on decreasing the time by taking a more targeted approach to model processing. Rather than perform a "Process Full" against the model, I am trying the following, in order: fully process dimension tables, fully process fact tables, fully process the measure table, then recalculate the model.

Two of my fact tables are not rebuilt daily; instead, a daily snapshot dataset is added each day. I have decided to partition these tables as they are burdensome. My partitioning strategy is what I'm mostly hoping for critique of, though I'd be happy for thoughts on any part of my approach. To partition these snapshot tables, I have taken the following approach: I've created two database views for each table, "Latest" and "Older". The views use a stored procedure as the date filter, which determines the last successful run date of this SSIS job. For instance, if my daily SSIS job has been failing since March 8th, the "Latest" views would contain all snapshot data >= March 8th, and the "Older" views would contain snapshot data < March 8th. My goal here is to avoid unnecessary processing of these snapshot tables due to their size. Each snapshot table in the model has two partitions: Latest and Older.

I have configured my processing task to fully process only the "Latest" partition of these snapshot tables. Once a week I run a Process Full on the entire model to cover my bases. I am concerned about whether this will effectively keep my model up to date. It is performing admirably time-wise, and I will be performing extensive testing to ensure data consistency, but I am also interested in whether this is a coherent approach or whether it's flawed. Thank you for any thoughts or advice you can provide!
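For reference, a targeted refresh like the one described can be issued as a single TMSL command against a Tabular model; the database, table, and partition names below are placeholders for illustration, not from the question:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "WarehouseModel",
        "table": "FactSnapshot",
        "partition": "Latest"
      },
      {
        "database": "WarehouseModel",
        "table": "DimCustomer"
      }
    ]
  }
}
```

Listing a partition refreshes just that partition, while listing only a table refreshes the whole table; a follow-up command with "type": "calculate" rebuilds calculated columns and relationships, which corresponds to the "recalculate model" step above.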
Kad (1 rep)
Mar 11, 2025, 02:36 PM
0 votes
0 answers
44 views
Help with a complex DAX measure
I am struggling with a DAX measure. Here are the fact and 2 dimension tables I have:

factAction(dimAccountID, dimActionID, Date, ActionQty)
dimAction(dimActionID, ActionCode, ActionDescription)
dimDate(Date, WorkingDay)

The relationships are:

dimDate.Date -> factAction.Date
dimAction.dimActionID -> factAction.dimActionID

And I have a DAX measure: ActionQty = SUM('factAction'[ActionQty])

We want to report on the number of actions with ActionCode AAA. Fine, that's easy: just select ActionCode = AAA in the Power BI filter. Now we want to also report, for those accounts that had ActionCode AAA, the sum of ActionQty for ActionCode AAB where the action date is within 2 working days of the AAA action date. Any suggestions would be welcome!

Edit: added sample data and expected output

| AccountID | Date | ActionCode | ActionQty |
|---|---|---|---|
| AC1 | 06-Apr-20 | AAA | 1 |
| AC1 | 09-Apr-20 | AAB | 1 |
| AC1 | 07-Apr-20 | BBB | 1 |
| AC2 | 16-Apr-20 | AAA | 1 |
| AC2 | 20-Apr-20 | AAB | 1 |

So when filtering for AccountID AC1 and April 2020, this new measure would return 0 (because the number of working days between the AAA and AAB ActionCodes for this account is greater than 2). When filtering for AccountID AC2 and April 2020, the new measure should return 1 (because 20 April is within 2 working days of 16 April).
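One possible shape for such a measure, sketched under the assumption that a sequential WorkingDayRank column is added to dimDate (numbering working days 1, 2, 3, ...), which is not in the original model; treat this as an untested outline rather than a working solution:

```dax
AAB Qty Near AAA :=
VAR AAARanks =
    CALCULATETABLE (
        SELECTCOLUMNS ( factAction, "Rank", RELATED ( dimDate[WorkingDayRank] ) ),
        dimAction[ActionCode] = "AAA"
    )
RETURN
    CALCULATE (
        SUM ( factAction[ActionQty] ),
        dimAction[ActionCode] = "AAB",
        FILTER (
            factAction,
            VAR AABRank = RELATED ( dimDate[WorkingDayRank] )
            RETURN
                -- keep AAB rows whose working-day rank is within 2 of some AAA row
                COUNTROWS ( FILTER ( AAARanks, ABS ( [Rank] - AABRank ) <= 2 ) ) > 0
        )
    )
```

The rank column turns "within 2 working days" into simple subtraction; a plain DATEDIFF over calendar days would miscount across weekends.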
Steve (21 rep)
Apr 13, 2020, 06:55 PM • Last activity: Mar 6, 2025, 09:32 AM
1 votes
1 answers
1378 views
Improve SSAS Tabular query performance
I've built a Tabular cube with a fact table and four dimensions. The fact table has about 2,500,000 rows (and about 20 calculated measures at the cube level) and the dimension tables have 5k-50k rows. When I connect to the cube from Excel and try to add a dimension, I have to wait about 15-20 seconds. Do you know some ways to improve the query performance of such a tabular cube (SQL 2014)? The performance issue is not visible with Year consolidated; filtering by Port works fast. The issue appears when expanding Year down to weeks (this takes about 15 seconds, and filtering by Port afterwards takes another 15 seconds every time). (Screenshots omitted.)
Tomasz Wieczorkowski (362 rep)
Dec 14, 2016, 04:14 PM • Last activity: Feb 15, 2025, 07:03 PM
1 votes
1 answers
1706 views
How to release SSAS memory without forcing restart
We have a production server with two SSAS instances: one for user querying and one with empty templates where we do new releases and (full) processing of the cubes. We then back up and restore the processed cubes to the main instance and delete them from the 'processing' instance. This is to prevent any downtime on the instance used by clients. These processed cubes seem to be kept in memory even though they are deleted after processing and backup. The processing instance holds no data as seen in SSMS, but it keeps growing in memory (up to 40-50GB) until it starts failing to process due to memory issues after a few days. About 95% of this memory is outside of the shrinkable/non-shrinkable data cleaner memory, so the memory limits are not doing anything to release it. After processing, all cleaner memory drops to several hundred MB, while total memory usage for this instance stays high and will keep growing until we have failures. I don't believe the solution lies in any memory limits, since the used memory is not detected by the SSAS data cleaner. I have tested adjusting these memory limits, with no effect. Doing a Process Clear before deleting the cubes also has no effect. The only thing that works is a manual restart of this instance every 2-3 days, but this is obviously not a proper/maintainable solution (automating this in a job step would require a proxy account with full admin rights on our production server, something we would like to avoid). All software is up to date (Microsoft Analysis Server version 15.0.35.33), VertiPaqPagingPolicy = 1, server mode is Tabular and all cubes are in Import mode. I've been researching for a while now, but can't find the same issue anywhere, let alone a solution. Any help would be greatly appreciated!
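For completeness, one thing worth trying after the deletes is an explicit XMLA ClearCache against the processing instance (the database ID below is a placeholder), though if the leaked memory sits outside the cleaner's view this may well not help either:

```xml
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>ProcessingCopyDB</DatabaseID>
  </Object>
</ClearCache>
```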
DBHeyer (11 rep)
May 26, 2023, 08:04 AM • Last activity: Jan 19, 2025, 04:08 AM
1 votes
1 answers
75 views
Azure Analysis Service Tabular Cube - How To Impersonate A User in SSMS
Test cube, test user: connect to it (Azure Analysis Services) in SSMS, right-click the cube, Browse, and click "Impersonate". You will then see a series of prompts that are specific to an on-premises implementation of AD, so local users and local groups (in this case it's just my local system). (Screenshots of the dialogs omitted.) How do I select a user from MS Entra through these prompts? If that can't be done, is there another easy UI-driven way to impersonate a user? Did Microsoft just not extend support for this authentication type to SSMS (at least for Azure Analysis Services)?
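One commonly used UI-driven alternative, assuming the connecting account is a server administrator, is the EffectiveUserName connection-string property instead of the Browse/Impersonate dialog. In the SSMS connection dialog, under Options > Additional Connection Parameters, add something like the following (the UPN is a placeholder):

```
EffectiveUserName=testuser@yourtenant.onmicrosoft.com
```

Queries in that session are then evaluated with that user's role membership, which makes this a practical way to test row-level security against a cloud identity.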
David Rogers (215 rep)
Jun 25, 2024, 05:29 PM • Last activity: Sep 20, 2024, 01:20 PM
5 votes
3 answers
27260 views
SSAS Tabular: ImpersonationMode that is not supported for processing operations
I have a SQL 2016 SP1 SSAS Tabular instance. I've deployed a model whose data source is configured to impersonate the current user (screenshots of the model and connection properties omitted). When I try to process the database or a table, I get the error **"The datasource contains an ImpersonationMode that is not supported for processing operations"**. But if I change the impersonation info on the connection properties to use the service account instead of the current user, it works fine. We also don't get this issue if we change the Default Mode to DirectQuery instead of Import, but we need to use Import because we need the DAX USERNAME function for row-level security. I am an admin on the SSAS instance and also an admin on the SQL Server instance that is the data source. Why can't I process the SSAS tabular model as my user?
Adrian S (326 rep)
Feb 22, 2017, 02:13 PM • Last activity: Aug 8, 2024, 04:15 PM
3 votes
5 answers
12035 views
The column 'Date Offset' in table 'Date' has invalid bindings specified
I'm trying to deploy a tabular model to a server using the "Analysis Services Deployment Wizard". When attempting to deploy, I get the below error.

> The JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'The column 'Date Offset' in table 'Date' has invalid bindings specified.

The column in question uses the below calculation, which was found here: INT([Date] - TODAY()) What should I look for in order to resolve this error?
Neil P (1294 rep)
Aug 15, 2017, 10:21 AM • Last activity: Dec 28, 2023, 05:57 AM
2 votes
2 answers
8183 views
How do I tell which Edition and Version my SSAS Instance is running?
In SQL Server, on the database engine, you can run a SQL query like this to get the server's version info, such as Edition, Version, Update Level, etc.: SELECT @@VERSION What is the MDX equivalent for querying an SSAS (Tabular or Multidimensional) instance? Are there SSAS DMVs that can get me this answer? I browsed the Books Online page for SSAS DMVs, but I didn't notice any DMVs that could help me. DISCOVER_INSTANCES didn't seem to have the info. Let's assume that the SQL Server Database Engine is not installed on the same server as the SSAS instance, so I cannot check this by querying the database engine. In my case, this is for a SQL 2012 Tabular instance, but I would like to know how to query a Multidimensional instance too.
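For reference, a sketch of the DMV route: schema rowsets can be queried from an MDX or DMX query window in SSMS while connected to the SSAS instance, and the DISCOVER_PROPERTIES rowset exposes the engine version string (edition information may need to come from other discover rowsets or the instance properties):

```sql
-- Returns the version string of the connected SSAS instance
SELECT *
FROM $SYSTEM.DISCOVER_PROPERTIES
WHERE PropertyName = 'DBMSVersion'
```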
John G Hohengarten (674 rep)
May 2, 2016, 04:30 PM • Last activity: Feb 24, 2023, 03:12 PM
2 votes
2 answers
2849 views
StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database
I am an administrator in SSAS. I have many data warehouse servers; on some servers I have SQL Server and SSAS on the same machine. I successfully back up the SSAS databases and even check whether the backups are healthy. Recently, however, one of our servers, which is managed by a third-party company based in Canada, needed to be replaced. While migrating all the SSAS databases from the old server to the new one, I am getting this error message when processing the database on the new server (screenshots of the old and new servers omitted):

> This command cannot be executed on database 'DWCA' because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database.
Marcello Miorelli (17274 rep)
Sep 20, 2019, 12:25 PM • Last activity: Jan 3, 2022, 02:00 PM
1 votes
2 answers
3669 views
The 'Database' with 'ID' = 'xxxxxxx' doesn't exist in the collection
I have an SSAS Tabular server with 30 databases. I also have a scheduled process to back up all databases daily. When I run the process, which is an SSIS package with a script task using AMO, I get this error:

> The 'Database' with 'ID' = 'nameofdatabase' doesn't exist in the collection.

The backup process runs under the service account credential, and it successfully backs up 27 databases but fails for only 3 of them. I checked those databases and don't see anything special about them. I googled the error message, and most of the reported issues are related to deploying or processing the database. I don't see any reason for the backup to fail. What is the problem and how can I resolve this issue?
user71787
Jan 2, 2018, 07:06 PM • Last activity: Sep 21, 2019, 03:28 PM
6 votes
2 answers
20059 views
SSAS Model Refresh - Not enough memory to complete this operation Error
We have started encountering an issue with the refresh of our tabular SSAS model. The tabular SSAS model has 38 tables in it. This process had been running without issue for over a year; however, for around a month now, we have not been able to successfully process the tables within the model. If I access the SSAS database > right-click > Process Database > select the mode Process Default, followed by OK, this is when the problem occurs. It will sit there for around 5 minutes before failing with the error message:

Failed to save modifications to the server. Error returned: 'There's not enough memory to complete this operation. Please try again later when there may be more memory available.

If I try to process the tables individually, I receive the same error message too. I have looked into the memory settings for SSAS within the advanced window and have reset the values to their defaults (screenshots of the key values omitted). The server has been rebooted several times; we still have the same problem.

Environment details:
Windows Server 2016 Datacenter
SQL Server 2017 (RTM-CU9-GDR) (KB4293805) - 14.0.3035.2 (X64)
SSAS Version: 14.0.223.1
Server Mode: Tabular
Server Memory: 64GB
SQL Server Assigned Memory: 28GB

I have read multiple articles online regarding these sorts of problems, but nothing seems relevant/useful so far. Any guidance/assistance would be greatly appreciated. *Disclaimer: I am not a BI/SSAS guy. I'm just a DBA who has been given this problem to look at, so forgive me if I don't quite explain this correctly.*
grouchball (191 rep)
Mar 22, 2019, 11:07 AM • Last activity: Jul 23, 2019, 11:28 AM
2 votes
1 answers
777 views
SSAS Tabular model Date Dimension with Time possible?
I am in the process of building my first SSAS Tabular model and thought everything was going well, until trying to create a measure by DateTime. In my warehouse, I have a Dim_Time dimension, which has a DateTime column, with a row for every 5 minutes for the last 2 years. 5 minutes is the granularity that we require. In addition, there is a TimeID identity column on the table. In my ETL, I assign a TimeID to each fact row depending on what 5-minute range it slots into. So the end result is a relationship between the 2 tables on the ID, with, let's say, 10 facts per TimeID. Now I am trying to do a simple count of rows per time range, for example, how many facts in the current hour. The problem I'm finding is that I am seeing no data in either Power BI or Excel when testing my measure, and I'm 90% sure it's related to my model filtering my DateTime as a date when calculating the measure.

Fcts by Date := CALCULATE ( COUNTA(Fct_Table[IDColumn]), Dim_Time[DateTime])

Please can someone help point me in the right direction, as I am struggling to find anything with regards to working with a DateTime dimension. Thank you very much. I am using SQL/SSAS 2017.
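A sketch of a measure that keeps the full datetime granularity by filtering the DateTime column with a range, rather than passing the column to CALCULATE directly; the table and column names are from the question, but the "current hour" boundary logic is an assumed interpretation and untested:

```dax
Facts Current Hour :=
VAR Now_ = NOW ()
-- strip minutes and seconds to get the start of the current hour
VAR HourStart = Now_ - TIME ( 0, MINUTE ( Now_ ), SECOND ( Now_ ) )
RETURN
    CALCULATE (
        COUNTA ( Fct_Table[IDColumn] ),
        FILTER (
            ALL ( Dim_Time[DateTime] ),
            Dim_Time[DateTime] >= HourStart
                && Dim_Time[DateTime] < HourStart + TIME ( 1, 0, 0 )
        )
    )
```

A bare column reference is not a valid CALCULATE filter argument in DAX, which may explain the empty results; a boolean or table filter such as the one above is required.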
WadeH (540 rep)
Jun 5, 2019, 02:20 PM • Last activity: Jun 5, 2019, 08:07 PM
0 votes
1 answers
1082 views
Tracing tabular refresh errors
I have a tabular cube on a SQL Server 2017 instance. The cube is refreshed with a full refresh command in a SQL Server Agent job that runs on a schedule. When I check the history of the job, I see that it went through successfully, so everything is "green". But when I check the tables of the cube, one of the tables is not processed well: it does not contain any data. A manual processing of only this table solves the problem, so it is not a table config/definition issue. Where can I find more information, such as an error message explaining why this table is not processed in the automated full refresh command?
Zsombor Zsuffa (3 rep)
Jan 25, 2019, 09:44 AM • Last activity: Jan 25, 2019, 04:46 PM
1 votes
1 answers
969 views
Dynamic Partition on tabular model 1200 using visual basic 2015 in visual studio
I am using Visual Studio 2015, SQL Server 2016 and tabular model compatibility level 1200. I am trying to create dynamic partitions on the existing cube by creating an SSIS package using a script task and Visual Basic code. Can anyone suggest an approach to create a dynamic partition every month and process the created partition every night? I searched online and didn't find any example with this environment.
Harison (11 rep)
Feb 10, 2017, 04:17 PM • Last activity: Dec 23, 2018, 01:01 PM
0 votes
1 answers
1117 views
Link DAX OpenQuery Output to SQL Server Temp Table
I want to record measures from a Tabular model, as KPIs, and store the values in a table in SQL Server. I have created a linked server from my SQL Server instance to my SSAS instance, and I have written a stored procedure to execute the DAX code via OPENQUERY, with the intention of storing the results in a temp table before loading them into the KPI table. I am using a temp table because I am querying multiple tabular models. My problem occurs when I try to update my temp table with values from my OPENQUERY output. The OPENQUERY output is currently within a CTE, and I was hoping to do a simple join to the temp table, but because the output from the DAX query returns each column name within [ ], when I try to join on one of the OPENQUERY columns I receive the error "Invalid column name...". E.g.:

UPDATE temp
SET temp.[Current Contract Count] = cte.[Contract Count]
FROM #ServiceZoneKPIs AS temp
INNER JOIN tabular_cte AS cte
    ON cte.[Copy of Service Zone Code] = temp.[ServiceZoneAlternateKey]

The error occurs because a column named 'Copy of Service Zone Code' does not exist in the OPENQUERY output; the output column name is literally [Copy of Service Zone Code], brackets included. I may well be missing a simple trick here? How can I join an OPENQUERY output, returning tabular model data, to my T-SQL temp table?
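The usual trick, sketched below with hypothetical names: a DAX resultset column typically arrives with the table prefix and literal square brackets in its name (e.g. a column literally named ZoneTable[Copy of Service Zone Code]). In T-SQL you reference it by wrapping the whole name in brackets and doubling the closing bracket that is part of the name:

```sql
-- Hypothetical linked server and DAX query for illustration; the inner
-- column comes back named ZoneTable[Copy of Service Zone Code], brackets included.
SELECT oq.[ZoneTable[Copy of Service Zone Code]]] AS ServiceZoneCode
FROM OPENQUERY(SSAS_LINKED,
    'EVALUATE SUMMARIZECOLUMNS ( ZoneTable[Copy of Service Zone Code] )') AS oq;
```

Inside a bracket-delimited T-SQL identifier, a literal ] is escaped by doubling it to ]], which is why the outer reference ends in ]]].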
DimUser (382 rep)
Jul 2, 2018, 02:14 PM • Last activity: Jul 2, 2018, 02:43 PM
1 votes
1 answers
1122 views
How to move SSAS tabular instance databases?
We have a SQL Server 2012 SSAS tabular instance in our environment that sits in domain1. We need to move its SSAS databases to a different SQL Server 2012 instance on domain2, which has no trust established with domain1. This is the first time I have been tasked with moving SSAS to another server. What would be a good approach to moving these SSAS tabular databases? Would a normal backup and restore do, or does it have to be a detach/attach process? Also, when I migrated our SQL Server databases, I had to adjust the account logins since there is no trust between the domains. Do I have to do the same thing with SSAS in this case?
Keith Rivera (617 rep)
Jul 16, 2016, 08:14 PM • Last activity: Mar 17, 2018, 06:19 AM
0 votes
1 answers
403 views
Multi condition DAX row filter
I am trying to convert the following SQL row-level security function into a DAX filter within a Tabular model:

CREATE FUNCTION [Security].[fn_securitypredicate](@BrandID AS INT, @ChannelId AS INT)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
(
    SELECT 1 AS fn_securitypredicate
    WHERE
    (
        EXISTS (SELECT 1 FROM security.RLSStaffBrand
                WHERE StaffUsername = SYSTEM_USER AND BrandId = @BrandID)
        AND EXISTS (SELECT 1 FROM security.RLSStaffChannel
                WHERE StaffUsername = SYSTEM_USER AND ChannelId = @ChannelID)
    )
    OR
    (
        EXISTS (SELECT 1 FROM security.RLSStaffBrand
                WHERE StaffUsername = SYSTEM_USER AND BrandId = @BrandID)
        AND NOT EXISTS (SELECT 1 FROM security.RLSStaffChannel
                WHERE StaffUsername = SYSTEM_USER) -- this user is not restricted by Channel
    )
    OR
    (
        NOT EXISTS (SELECT 1 FROM security.RLSStaffBrand
                WHERE StaffUsername = SYSTEM_USER)
        AND EXISTS (SELECT 1 FROM security.RLSStaffChannel
                WHERE StaffUsername = SYSTEM_USER AND ChannelId = @ChannelID)
    )
)
GO

So far I have the following DAX filters, but they only handle the first condition in the SQL code. I don't know if it's even possible to replicate the rest in DAX.

='Brand'[BrandId]=LOOKUPVALUE('RLSStaffBrand'[BrandId], 'RLSStaffBrand'[StaffUsername], USERNAME(), 'RLSStaffBrand'[BrandId], 'Brand'[BrandId])

='Channel'[ChannelId]=LOOKUPVALUE('RLSStaffChannel'[ChannelId], 'RLSStaffChannel'[StaffUsername], USERNAME(), 'RLSStaffChannel'[ChannelId], 'Channel'[ChannelId])
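One possible direction, sketched as a per-table filter that also covers the "user has no restriction rows, so unrestricted" branches of the SQL logic; the table and column names are from the question, but treat this as an untested outline:

```dax
-- Row filter on the 'Brand' table; the 'Channel' table gets the mirror image
-- with RLSStaffChannel and ChannelId.
=
VAR MyBrands =
    CALCULATETABLE (
        VALUES ( 'RLSStaffBrand'[BrandId] ),
        'RLSStaffBrand'[StaffUsername] = USERNAME ()
    )
RETURN
    ISEMPTY ( MyBrands )               -- no rows: user is not restricted by Brand
        || 'Brand'[BrandId] IN MyBrands
```

Because Brand and Channel are filtered by separate role filters that apply simultaneously, the AND between the two restrictions falls out of applying both filters; only the "unrestricted when no rows exist" branch needs explicit handling.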
Adrian S (326 rep)
Sep 13, 2017, 02:00 PM • Last activity: Sep 13, 2017, 10:45 PM
1 votes
0 answers
603 views
How make an SSAS Tabular model case sensitive?
I need to make my tabular model case sensitive. Currently my tabular model is case insensitive, how do I change my existing cube to be case sensitive?
Neil P (1294 rep)
Aug 10, 2017, 09:30 AM
1 votes
1 answers
1436 views
SSAS Tabular Cube partitioning - number of partitions
We have a fact table with 500 million rows (10 years of data, about 60GB). I've partitioned this table at the database level and at the SSAS Tabular cube level as well, one month per partition, to improve overall browsing performance and SSAS cube processing time. But recently I heard that configuring more than 2 partitions for an SSAS Tabular cube is not good, and that there should be a maximum of 2 partitions per tabular cube. Have you heard anything like that? Is it true?
Tomasz Wieczorkowski (362 rep)
May 19, 2017, 09:09 AM • Last activity: Jun 1, 2017, 07:34 PM
2 votes
1 answers
1207 views
SSAS Tabular QueryMode Confusion. Why Hybrid mode is useful?
I read a couple of articles about QueryMode in Tabular. I finally noticed that the hybrid modes (DirectQuery with In-Memory and In-Memory with DirectQuery) are NOT useful for my case; they are not working the way I would like. We have huge Tabular databases and we use In-Memory for all partitions, and the concern is that we will run out of memory at some point. My thought is: if I could use DirectQuery for old partitions and In-Memory for recent partitions (newer dates), this would give us fast-performing reports for recent dates, keep recent data in RAM, and leave the older data, which is not accessed often, to DirectQuery. Does my thought make sense? Is there any way to use Tabular databases this way? Thanks for sharing your thoughts.
user71787
May 18, 2017, 11:51 PM • Last activity: May 30, 2017, 10:53 PM
Showing page 1 of 20 total questions