I am working on a POC that uses Solr with SQL Server.
I have a very complex data model in SQL Server that requires a lot of joins and scalar functions to strip markup, among other things.
This is turning out to be a performance bottleneck. To address it, we are considering two options: NoSQL (MongoDB) or Solr alongside SQL Server.
With MongoDB, we would attach replication events to all CRUD operations, so that after a successful insert, update, or delete on SQL Server the data is carried over to MongoDB as well. When we have to perform a search, we run it directly against the Mongo collections.
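For illustration, the capture side I have in mind is the usual trigger-plus-outbox pattern: a trigger logs each change to a staging table, and a separate sync service drains that table into MongoDB. A rough T-SQL sketch, where `dbo.Products`, its `ProductId` column, and the `MongoOutbox` table are all hypothetical:

```sql
-- Hypothetical outbox table; a separate sync service polls it and
-- replays each row against the corresponding MongoDB collection.
CREATE TABLE dbo.MongoOutbox (
    Id        INT IDENTITY PRIMARY KEY,
    TableName SYSNAME   NOT NULL,
    RowId     INT       NOT NULL,
    Operation CHAR(1)   NOT NULL,  -- 'I' = insert, 'U' = update, 'D' = delete
    LoggedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- Capture CRUD events on a hypothetical Products table.
CREATE TRIGGER dbo.trg_Products_MongoSync
ON dbo.Products
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows in "inserted": new inserts, or the new side of an update.
    INSERT INTO dbo.MongoOutbox (TableName, RowId, Operation)
    SELECT 'Products', i.ProductId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM inserted AS i;

    -- Rows only in "deleted": true deletes.
    INSERT INTO dbo.MongoOutbox (TableName, RowId, Operation)
    SELECT 'Products', d.ProductId, 'D'
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted);
END;
```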
**This sounds very appealing, as the search currently joins 32 tables, which could be collapsed into 2 collections in MongoDB.**
We are also exploring Solr with SQL Server using the DataImportHandler (DIH).
My concern is that, based on this article (http://www.codewrecks.com/blog/index.php/2013/04/29/loading-data-from-sql-server-to-solr-with-a-data-import-handler/), I have to run an import for each entity.
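For reference, the DIH setup in that article maps each entity to its own SQL query in a `data-config.xml`. A minimal sketch of what I understand that to look like (connection details, table, and column names are placeholders):

```xml
<dataConfig>
  <!-- Placeholder connection details for SQL Server over JDBC. -->
  <dataSource type="JdbcDataSource"
              driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
              url="jdbc:sqlserver://localhost;databaseName=MyDb"
              user="solr"
              password="secret"/>
  <document>
    <!-- One entity per imported table; each maps query columns to Solr fields. -->
    <entity name="product"
            query="SELECT ProductId, Name, Description FROM dbo.Products">
      <field column="ProductId" name="id"/>
      <field column="Name" name="name"/>
      <field column="Description" name="description"/>
    </entity>
  </document>
</dataConfig>
```

Given that, my specific questions are: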
- How does a joining search work with Solr? Should I import each table from SQL Server into Solr and then write the join logic against the Solr APIs?
- Can I import multiple entities at once? Should I create a denormalized view of the expected result set and import that view into Solr (see the view sketch after this list)?
- Will these imports have to be run at regular intervals? If new data arrives after an import, does the Solr API reflect the change, or do I have to run another import before searching against Solr (see the delta-import sketch after this list)?
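To make the second question concrete, here is the kind of denormalized view I have in mind (table, column, and view names are made up, and the real one would flatten far more joins):

```sql
-- Hypothetical view flattening the joins into one row per document;
-- last_modified is included so delta imports can find changed rows.
CREATE VIEW dbo.SearchIndexView AS
SELECT p.ProductId    AS id,
       p.Name         AS name,
       c.CategoryName AS category,
       p.ModifiedAt   AS last_modified
FROM dbo.Products p
JOIN dbo.Categories c ON c.CategoryId = p.CategoryId;
```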
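And for the third question, my understanding is that DIH can do delta imports against such a view, along these lines (again assuming the hypothetical view above, with its `last_modified` column):

```xml
<!-- Delta import: deltaQuery finds primary keys changed since the
     last run; deltaImportQuery re-fetches each changed row. -->
<entity name="search" pk="id"
        query="SELECT * FROM dbo.SearchIndexView"
        deltaQuery="SELECT id FROM dbo.SearchIndexView
                    WHERE last_modified &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT * FROM dbo.SearchIndexView
                          WHERE id = '${dataimporter.delta.id}'"/>
```

From what I can tell, Solr does not watch the database itself, so something would still have to call `/dataimport?command=delta-import` on a schedule; is that correct?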
Finally, can Solr be compared with MongoDB? If anyone has done this kind of evaluation, please share your thoughts.
Asked by HaBo
(191 rep)
Apr 25, 2016, 05:47 AM
Last activity: Apr 25, 2016, 06:53 AM