Tuesday, April 19, 2005

Integration Install Part 1

Well, I didn’t have time to write an original post yesterday, but I did post an interesting article about Siebel Systems. At least I thought it was interesting, especially the part about them grooming themselves for a possible sale… especially since the name Microsoft was mentioned.

I was working on the integration piece between MSCRM and MS Great Plains for the last three days. I can’t wait to see the consulting bill on this one. It took a lot longer than they anticipated, and longer than it should have.

I provided them with the entire set of production dbs, well over 25 GB of data, to test the integration with. I copied them over to an external HD, and they came and picked it up back on Feb 22.

The installation architecture consisted of a SQL Server with a CRM db and a GP db on one box, and a SQL Server on another box that houses the Integrator; the idea is to have those on separate machines to help load balance. Even though this was the plan from the beginning, the consultant only tested the integration on a single machine and a single SQL instance. Needless to say, there were some inconsistencies, and it had to be redone a couple of times.

Testing and evaluating are important factors in any install. One of the key points of evaluating a system is to reproduce as much of the production system as possible on the test server, or in the test environment if more than one server is involved. The ideal would be to test on the exact same hardware, but for the most part that’s not feasible. If you plan on running a piece of software on a server that runs enterprise antivirus, you better be sure to put it on there when you test. If a consultant or VAR is testing something for you, push them to get the environment as close to production as they can. It’s your money, and testing can reveal a lot of problems and lets you try things several times before you pay three people to sit and troubleshoot an install issue that should have been identified in testing.

Anyway, our last integration piece was the default BizTalk model that was released 6 months after CRM came out of beta. And like I’ve said here before, it worked – but just barely. It was slow and required more resources than necessary. And when large updates were pushed through, like the company-wide price increase we had last year, it broke in the middle and had to be left behind. Literally. It took us weeks to fix it and upgrade to 3 GB of RAM, then another 2 weeks to force-feed the info through the sausage grinder. We had to break it down into bites and feed it only at night. Feeding it during the day slowed the production servers that ran GP down so much that we couldn’t post invoices or run our BOM inventory auto builds.

The initial integration took about 15 hours with that monstrosity. Once the tedious installation was done, Scribe Insight shattered that time, even though the database had grown considerably since the initial go-around 2 years ago. I’m not exactly sure how long it took, because it was finished the first time I checked on its progress 5 hours later. In fact, it finished so quickly that I initially thought it had failed, until I saw in the monitor that it had updated or integrated 25,000 records.

I’ve noticed some periodic slowdowns and some minor problems on a few records, but for the most part I feel pretty good about the integration. The true testing starts tomorrow, when the consultants come back to look through the results and train me a little further on the GUI. I’ll keep you posted on the results.

Thursday, April 14, 2005

Slugs not as slimy as you think

I finally got to finish up on a project I've been working on for a while and I thought I'd share it.

My company has a unique situation where the MFG Sales Reps are not employees of the company. And as of right now they do not have access to our proprietary databases or CRM. Therefore they are not registered users of the system.

This complicates things, because in some regards we would like them to act as users. We have not reached the point where we can justify buying an additional 45 or so licenses. Turning them into accounts that we link as sub-accounts to the accounts they would own is a cumbersome workaround, but it’s the best solution we have right now.

We would like to generate a Lead Notification & Follow-Up form that is emailed to the appropriate Rep when a lead is pseudo-assigned to them. I say pseudo-assigned because, when a lead is generated, we cannot really assign it to a Rep in MSCRM, since only a user can be an owner. So the assignment is theoretical. Instead we assign it to the Rep’s Regional Sales Manager (RSM). After interviewing an RSM, I determined that not all leads will be pseudo-assigned to a Rep.

I decided the best way to handle this is through workflow. I originally attempted to create an email template in MSCRM, but the MSCRM email template has certain limitations, the biggest being that it cannot be sent to anyone other than the email address associated with the MSCRM lead.

Swing and a miss…

My next idea was to create an email as a workflow action. However, there was no obvious way to skim information off the Lead object and put it in the body of the email.

After discussions with Microsoft, I discovered that we could create an email in workflow and use XML slugs in the body; we could also use HTML to format the email. This worked perfectly for the Lead Notification. I haven’t tested it anywhere else yet, but I am fairly certain this can apply across the board.

The slugs are pretty simple to understand, and the information to create them is readily available in the Schema Manager. You can get to the Schema Manager through the Deployment Manager on the CRM server.

The basic format for a lead’s first name is: &lead.firstname;
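To give you an idea of how this comes together, here is a minimal sketch of what an HTML-formatted workflow email body using these slugs might look like. The attribute names below are what I would expect for standard lead fields in the MSCRM schema, but verify the exact names in the Schema Manager before relying on them.

```html
<!-- Sketch of a workflow email body using XML slugs.
     Confirm each attribute name in the Schema Manager first. -->
<html>
<body>
  <p>A new lead has been assigned to your region:</p>
  <p>
    Name: &lead.firstname; &lead.lastname;<br>
    Company: &lead.companyname;<br>
    Phone: &lead.telephone1;<br>
  </p>
  <p>Please review the lead and forward this message to the appropriate Rep.</p>
</body>
</html>
```

When the workflow rule fires, each slug is replaced with the corresponding value from the Lead record before the email is sent.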

There is one hiccup in this system: when a custom picklist is pulled from the schema, only the integer is returned as the picklist value, not the string that the integer represents.

Using this as the basis of our Lead Notification system, we have created the following procedure. When a batch of Leads is entered, the data entry person can select the leads and apply the LEAD NOTIFICATION AND FOLLOW UP rule. The system automatically generates an email that includes contact information and follow-up questions. The email is sent to the owner of the record, in most cases the RSM. The RSM can then evaluate the Lead, make any further notation he deems necessary, and forward it to the appropriate Rep. The Rep can then fulfill the Follow-Up requirements and email it back. Since it is a registered CRM email, when the Rep replies it is already associated with the lead, making it that much easier for the RSM to qualify or disqualify the lead.

Not perfect, but a half decent workaround.

Wednesday, April 13, 2005

CRM Integration

One of the biggest advantages of MS CRM for UECORP was that it fit so well with our current Great Plains solution. The Integrator that came with it was based on the BizTalk model, and while powerful, it was a lumbering resource hog. It might have been fine in a smaller environment, but with a 12 GB sales database, we were constantly running into buffer overflows that crashed the queues in the Integrator. We knew we would be looking for a better solution to the problem in the next year or so, but when Hurricane Ivan dumped 12 feet of water into my server room, we decided the future was now.

After months of searching for the right solution and looking for the right balance of speed and power, we settled on Scribe’s Insight solution, and the installation will begin tomorrow. The selection process itself was not without painful lessons.

Does anyone remember back at the turn of the century (1999-2000), when white papers were technical tools written by software engineers and technical writers? That’s not the case anymore, as these things have become just more marketing fodder corrupted by the almighty sales push. You can no longer read a white paper without seeing marketing sound bites like “much easier to install than other solutions” and “more powerful database engine.” These phrases wouldn’t be a sore spot for me, except that they often fail to back up these claims with any technical proof.

The main problem I found with Scribe’s white papers was that the white paper for the MSCRM to Great Plains Connector stated that certain add-ins or modules were part of the package, and only after the purchase did I find out that they were not. The financial impact of purchasing these modules separately and adding them to the package was significant, but not enough to push me in another direction. It did leave a bad taste in my mouth, though. I really like the Scribe Dashboard, and the product seems to be fast, easily customizable, and much simpler to work with than the default Integrator that came with MSCRM.

They offer some training with the product to get you familiar with it. They don’t document in the training information that you need to have the product installed to get anything out of the training. So even if it’s just in a test environment, make sure you get it installed first, or you’ll be behind the eight ball when the lessons start.

I’ll provide some more information after the install on my overall feel of the product, and document some of the pitfalls and sore spots we find while going live.