OTDS FAQ for Documentum

If you work with Documentum, sooner or later you'll have to face that moment where you're going to have to use OTDS in your system (as this is mandatory since Documentum 23.4). In this post I will try to provide some insights and tips about OTDS and Documentum.

OTDS is basically an IdP (an authentication/access control system) that OpenText created (with "other", more limited solutions than Documentum in mind) and that they have been pushing across many of the products in their portfolio.

What are the positive things that OTDS brings to the table when we are talking about Documentum?

  • Centralized authentication management (as in: configure your authentication in one place and reuse it for all components, removing the need for different libraries/configurations in Webtop, D2, REST, etc.)
  • Centralized user/group management: I just don't buy this, because it relies on companies having a "perfectly organized LDAP/AD", which I've never ever seen. It gets even worse if we include application-level groups/roles: in Documentum you can have group structures defined that are years away from any existing configuration in any AD/LDAP (and I do not see anyone manually recreating thousands of existing Documentum groups in OTDS).
  • Centralized licensing management (another push from other products, we will see how this really works, as I already expressed my concerns in a previous post)

Obviously, not everything is fantastic, and there are several topics you should be aware of:

  • An authentication system is totally outside the scope of ECM departments, meaning that no ECM expert is capable of properly evaluating, configuring or maintaining a system of this kind (talk to your cybersecurity/authentication team before doing anything!). On top of that, it can conflict with your company's existing authentication policies.
  • OTDS is not a product (check OT’s product page) but it is considered a “component” of other products. What does this mean?
    • You're relying on a critical piece of software (it is supposed to handle access to your data) which is not even "recognized" as a product by its own vendor
    • OT has a leading solution in this field, NetIQ, which came with the Micro Focus acquisition, so it is not clear what their roadmap is regarding IdPs.
    • There's no "product support" for OTDS; it is delegated to other product teams (i.e. Documentum support). Obviously, they have no idea about OTDS itself, and OT bureaucracy makes it highly complicated to get an answer when you have an issue.
  • OTDS is a single point of failure: if OTDS doesn't work, no user can work, even if everything else is up and running.
  • OTDS was conceived for other, much simpler OT products. As Documentum is kind of a Swiss army knife, OTDS greatly limits existing DCTM configurations (which makes this integration a challenge in certain environments)

So, given these topics, what scenarios can we find when integrating OTDS? Well, it depends on your existing system. I think most systems can be grouped into three different categories:

  • Highly regulated / complex systems: You have your own IdP (Entra ID, NetIQ, etc.), and you also have your own system to handle access to Documentum (i.e. automatic creation of users on repositories). This also includes multiple repositories in the organization, and many applications with many groups.
  • Small installations: Single repository approach, not many users, not many groups, still using user/password login
  • New systems / upgrades to a “clean” system

Based on this, what is the best (or only) approach to integrate OTDS in these scenarios?

  • Highly regulated / complex systems: Forget about documentation. You do not need resources, access roles or anything. Just use a synchronized partition with the required OAuth clients and delegate authentication to your existing IdP. Minimal configuration, minimal maintenance (other than getting this to work). OTDS here acts just as a proxy for getting the OTDS token that will be used by DCTM applications.
  • Small installations: Ideal scenario, as you're using Documentum like some of the other, more limited OT products, which is what OTDS was originally intended for. Probably your only effort will be manually configuring groups.
  • New systems / upgrades: You “should” try to use OTDS in the “expected” way. Be aware of several limitations coming from the lack of support for existing solutions in Documentum:
    • Multi-repository configurations are a nightmare. Nobody seems to (or wants to) understand that you can have different users in different repositories, and this can be a challenge.
    • Mix of functional/inline accounts and users accounts can be a challenge as well.

Finally, some tips that you should consider when using OTDS:

  • As soon as it is configured, add your organization's users responsible for OTDS to the admin group, disable the otds.admin account and force 2FA (and manually remove the user/password login from otds-admin). You do not want to expose the admin page to anybody (even less if you have this running on a public cloud), as this is a huge security risk.
  • Token TTL is a topic when dealing with timeouts. Until now you only had to worry about the web application timeout, but now the OTDS token can also expire, so the TTL should be something like TTL = timeout + time buffer. For example, if a user stops using an application after 1 hour of work, and you have defined a 4-hour timeout on the application, your token needs to be valid for at least 5 hours.
  • When configuring D2, completely ignore the documentation. By no means mess with the tickets (who thought of this approach? who allowed this to be done??) or perform the configuration step that tells you to enable the "Global Registry encryption in other repositories". This is no longer required since 23.4 P09 (you're welcome), as it was a huge security risk (and I'm quoting the talented engineering team here: "a design flaw"), but they still seem to have forgotten to remove that section from the documentation.
  • Make sure you test all scenarios before going live or you’ll be in trouble, as fixing these “live” will be challenging:
    • Web interface authentication
    • DFC interface authentication
    • Token authentication
    • OTDS token generation from external authenticator token
    • Any existing user configuration authentication (LDAP, inline, functional accounts, direct url access to components, etc.)
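The token TTL rule from the tips above boils down to simple arithmetic. A trivial sketch (my own rule of thumb, not an official OTDS parameter):

```java
public class TokenTtl {
    // Minimum OTDS token TTL (in hours) so the token outlives the application
    // session: a user may keep working for up to maxActiveUseHours, and the
    // application session then stays valid for appTimeoutHours after the last action.
    static int minTokenTtlHours(int maxActiveUseHours, int appTimeoutHours) {
        return maxActiveUseHours + appTimeoutHours;
    }

    public static void main(String[] args) {
        // The example from the text: 1 hour of work + 4-hour application timeout
        System.out.println(minTokenTtlHours(1, 4) + " hours"); // 5 hours
    }
}
```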

OpenText vs C2

All of us who work with Documentum (and, I think, with OpenText in general) are used by now to the lack of resolution provided by the support team (as in: it takes months/years to fix something, even if you hand them the solution). But sometimes you come across situations that are really ridiculous and give the impression that OT and their products are just amateurs/juniors trying to look as if there's someone competent behind.

Today we came across a reported issue with D2 where a recurring error was thrown in the logs when running a C2 transformation:

java.lang.ClassCastException: class java.lang.Boolean cannot be cast to class java.lang.String (java.lang.Boolean and java.lang.String are in module java.base of loader 'bootstrap')

This error is really weird, as it is a type-casting error inside product code. After searching for the offending class, we found this code in the PDFUtils class:

if (parameters != null) {
    Iterator iterator = parameters.entrySet().iterator();

    while (iterator.hasNext()) {
        Map.Entry e = (Map.Entry) iterator.next();
        String paramName = (String) e.getKey();
        String paramValue = (String) e.getValue();
        transformer.setParameter(paramName, paramValue);
    }
}

In this case, the issue happens when paramValue is retrieved and (forcibly) cast to String. Not believing what we were seeing, we decided to modify the class to log each parameter name and whether its value is actually a String (which it obviously is not in at least one case, as it is throwing a ClassCastException :D):

EffectiveDateLabel: true
DocumentNameLabel: true
ApprovedDateLabel: true
PageTitle: true
ApprovalCategoryLabel: true
ApproverNameLabel: true
DocumentStatusLabel: true
featureSecureProcessing: false

As you can see, the error happens when a non-String parameter (featureSecureProcessing) is retrieved and cast to String, throwing the error above. At this point, anyone would think something like: "Well, another piece of shitty code from OT where someone has added a new parameter which is not a String and (as usual) nobody is doing any testing (besides end users, of course)". However, in this case it is even worse than that, because take a look 4 lines before that while loop:

boolean featureSecureProcessing = true;
if (null != parameters && null != parameters.get("featureSecureProcessing"))
    featureSecureProcessing = ((Boolean) parameters.get("featureSecureProcessing")).booleanValue();

s_transFactory.setFeature("http://javax.xml.XMLConstants/feature/secure-processing", featureSecureProcessing);
Transformer transformer = s_transFactory.newTransformer(new StreamSource(xslfoInput));
transformer.clearParameters();
if (parameters != null) {
    Iterator<Map.Entry> iterator = parameters.entrySet().iterator();
    while (iterator.hasNext()) {
        Map.Entry e = iterator.next();
        String paramName = (String) e.getKey();
        String paramValue = (String) e.getValue();
        transformer.setParameter(paramName, paramValue);
    }
}

Four lines before the faulty cast, that same parameter (featureSecureProcessing) is retrieved AS A BOOLEAN from THE SAME parameters map whose values are later cast to String.
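Outside of D2, the failure is trivial to reproduce: put a Boolean into a Map<String, Object> and blindly cast every value to String, exactly as the loop does. A self-contained sketch (the String value here is a hypothetical placeholder; only featureSecureProcessing matters):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CastRepro {
    // Applies the same blind (String) cast as PDFUtils and returns the name of
    // the first parameter whose value is not a String, or null if all casts pass.
    static String firstNonStringParam(Map<String, Object> parameters) {
        for (Map.Entry<String, Object> e : parameters.entrySet()) {
            try {
                String paramValue = (String) e.getValue(); // the cast from the while loop
            } catch (ClassCastException ex) {
                return e.getKey(); // same failure D2 logs during the C2 transformation
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, Object> parameters = new LinkedHashMap<>();
        parameters.put("EffectiveDateLabel", "Effective Date");   // hypothetical String value
        parameters.put("featureSecureProcessing", Boolean.FALSE); // Boolean, as in the real map

        System.out.println(firstNonStringParam(parameters)); // prints featureSecureProcessing
    }
}
```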

So, if you want to quickly fix this, just check e.getValue() with instanceof and convert non-String values with String.valueOf() (or toString()) to get the string value, and that's it.
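A minimal sketch of that fix (note that Transformer.setParameter accepts an Object value anyway, so converting with String.valueOf is enough; the map contents here are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SafeParams {
    // Converts any parameter value to its String form instead of blindly casting.
    static String asString(Object value) {
        return (value instanceof String) ? (String) value : String.valueOf(value);
    }

    public static void main(String[] args) {
        Map<String, Object> parameters = new LinkedHashMap<>();
        parameters.put("PageTitle", "Some title");                // hypothetical String value
        parameters.put("featureSecureProcessing", Boolean.FALSE); // the value that broke the cast

        for (Map.Entry<String, Object> e : parameters.entrySet()) {
            String paramName = e.getKey();
            String paramValue = asString(e.getValue()); // no ClassCastException anymore
            // in PDFUtils this would be: transformer.setParameter(paramName, paramValue);
            System.out.println(paramName + " = " + paramValue);
        }
    }
}
```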

Absolutely ridiculous, OT, absolutely ridiculous. We will see how many months it takes to "fix" this (probably we'll get the usual "upgrade to the latest version", just to throw the dice and see if it works. Note: this "bug" was detected on 21.2, and on 23.4 the code is still exactly the same)

Opentext vs. Linux

I was testing DCTM 22.4 when I decided to shut down the repository and, oh, surprise!

dmadmin@aldago-desktop:~$ dm_shutdown_dctm224
Stopping Documentum server for repository: [dctm224]

Picked up JAVA_TOOL_OPTIONS: -Djava.locale.providers=COMPAT,SPI
Picked up JAVA_TOOL_OPTIONS: -Djava.locale.providers=COMPAT,SPI
/opt/documentum/dba/dm_shutdown_dctm224: 65: shopt: not found
/opt/documentum/dba/dm_shutdown_dctm224: 67: [[: not found
Picked up JAVA_TOOL_OPTIONS: -Djava.locale.providers=COMPAT,SPI

“shopt: not found”? “[[: not found”? What the… let’s check the dm_shutdown_dctm224 script:

#!/bin/sh
################## DOCUMENTUM SERVER SHUTDOWN FILE ######################

WHAT.THE.XXXX? You're using shopt and [[ and you chose sh as the shell, which DOES NOT support them? With the amount of available shells that exist, you had to pick the one that doesn't support this? Well, another "bug" for the long history of "issues" on Linux systems… (by the way, in the new Workflow Designer, someone has finally decided to put Linux paths in the log files of the Linux distribution instead of "C:\…"; however, someone also forgot the initial / before the path… maybe in the next version)

Opentext Documentum is coming next month

I didn't realize that the roadmap documents were updated last month. It looks like the February release is still going to happen (and I've been told a definite date, so it looks like it won't be delayed). After reviewing them (haven't seen any changes :D), I can say:

  • Not many new features regarding CS; the pattern/usage visualization and the S3 support (which, as far as I know, could already be done without official support) are the new features.
  • No word about DFS, and I know for a fact that several customers have actively asked for updates to current libs and extended support for application servers.
  • Clients (D2, Webtop, xCP) get barely any changes.

I'm curious to see how many bugs are found in this first release from OpenText (and by the brave customers who go first into the unknown :D), considering that some of the experienced Documentum staff have left the company, plus the changes to the development cycle (which I guess happen when you move from a hardware company to a software company)

Welcome to OpenText My Support

On August 25, 2017 at 8:00 P.M. EST, the Dell EMC Enterprise Content Division, including
Documentum, Application Xtender, Captiva, Kazeon, Document Sciences and LEAP will officially be integrated into OpenText Customer Support systems. While the customer service quality you’ve come to expect won’t change, the way you access support resources will.

On August 25, 2017 at 8:00 P.M. EST, OpenText’s online support portal known as My Support will become the primary system that you use to submit, update and monitor the progress of your organization’s support requests for former ECD technology products such as Application Xtender, Captiva, Documentum, Kazeon, Document Sciences and LEAP. In addition, you will be able to access support resources such as forums, documentation, knowledge base articles, account information and much more!
To help make the transition easier, we have migrated your existing tickets to My Support. Any new tickets you create with the My Support wizard will also be accessible there.

Support adventures II (REST Services edition)

I wasn’t expecting this week to be so “productive” 🙂

I had the chance to engage again with Documentum support, which is always an adventure 😉

We're currently porting a framework from DFS to REST services. A couple of days ago, one of my colleagues had a problem and asked me if I knew why a query through the REST services was failing.

The query uses DATETOSTRING to retrieve a time field and returns the column with the attribute name as the alias; so, if a user requests expiration_date, we execute a query like:

select DATETOSTRING(expiration_date,'dd/mm/yyyy') as expiration_date from…

which returns the requested attribute in the desired (configurable) format. Easy, right? Well, the problem was that using r_creation_date or r_modify_date was throwing an exception:

{
  "status": 400,
  "code": "E_INPUT_ILLEGAL_ARGUMENTS",
  "message": "There are illegal arguments provided.",
  "details": "Invalid format: \"12/04/2013\" is malformed at \"/04/2013\""
}

We also observed that changing the alias to something other than r_creation_date/r_modify_date would work (but that broke our use case).

This was weird, because I was quite sure those kinds of queries had worked before, so I checked against the Content Server:

Connected to Documentum Server running Release 7.1.0210.0328  Linux64.Oracle
1> select DATETOSTRING(r_creation_date,'dd/mm/yyyy') as r_creation_date from dm_document enable (return_top 1);
2> go
r_creation_date
—————
12/04/2013
(1 row affected)

As I thought, it was working just fine. I suspected it had something to do with the custom date format, as EMC/OpenText tends to forget that not everyone uses ANSI or the American date format, so I decided to open an SR.

After some exchange of emails, we were told that both r_creation_date and r_modify_date were reserved words and that it was expected to fail. What???

After asking support to either file this as a product limitation or fix it, I decided to take a look at the code myself.

As I suspected, the query runs just fine, and the results are returned to the query controller. The problem is the way REST generates the response. If you run the query with a different alias and look at the result page, you'll get this:

"entries": [  {
"id": "http://127.0.0.1:8080/dctm-rest/repositories/repo.json?dql=select%20DATETOSTRING_LOCAL(r_creation_date,%27dd/mm/yyyy%27)%20as%20rr_creation_date%20from%20dm_document%20enable%20(return_top%201)&index=0",
"title": "12/04/2013",
"updated": "2017-06-16T11:04:38.284+00:00",
"published": "2017-06-16T11:04:38.284+00:00",
"content": {
"json-root": "query-result",
"definition": "http://127.0.0.1:8080/dctm-rest/repositories/repo/types/dm_document.json",
"properties": {
"rr_creation_date": "12/04/2013"
}}}]

Do you see those fields before the content element? That’s where everything breaks, why? Because of this:

public Date entryUpdated() {
    Date updated = null;
    Object modifyDate = ((QueryResultItem) getDataInternal())
        .getMandatoryAttribute("r_modify_date");
    if (modifyDate == null) {
        updated = new Date();
    } else if (modifyDate instanceof Date) {
        updated = (Date) modifyDate;
    } else {
        updated = DateFormatter.parse(modifyDate.toString());
    }
    return updated;
}

public Date entryPublished() {
    Date published = null;
    QueryResultItem queryResultItem = getDataInternal();
    Object modifyDate = ((QueryResultItem) getDataInternal())
        .getMandatoryAttribute("r_creation_date");
    if (modifyDate == null) {
        published = new Date();
    } else if (modifyDate instanceof Date) {
        published = (Date) modifyDate;
    } else {
        published = DateFormatter.parse(modifyDate.toString());
    }
    return published;
}

As you can see (besides the obvious copy/paste from one method to the other, renaming a single variable), the standard view looks in the query results for a column with one of those names, and if it finds a matching column, it tries to parse the value with a forced format (which, of course, fails miserably with our custom date format).
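The internal DateFormatter class is not public, so the exact format REST forces is an assumption on my part, but the same class of failure is easy to illustrate with the JDK's own strict ISO parser:

```java
import java.time.LocalDate;
import java.time.format.DateTimeParseException;

public class DateParseDemo {
    // Returns null if the strict ISO parser accepts the value, otherwise the parse error.
    static String tryIsoParse(String value) {
        try {
            LocalDate.parse(value); // ISO-8601 by default: yyyy-MM-dd
            return null;
        } catch (DateTimeParseException ex) {
            return ex.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryIsoParse("2013-04-12")); // null: ISO date parses fine
        System.out.println(tryIsoParse("12/04/2013")); // fails, like the REST response does
    }
}
```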

This, in my opinion, is a bug, because the behaviour of the query functionality is inconsistent across the Documentum stack; in fact, this query only fails with REST services. So I'll keep pushing for the SR to be treated as a bug and fixed by the talented team, and not as a "product limitation" or a "feature request".

And if you face the same problem, and it is still not fixed, you have three options:

  • Keep waiting for a fix
  • Extend com.emc.documentum.rest.view.impl.QueryResultItemView and handle the IllegalArgumentException thrown by DateFormatter.parse
  • Extend the controller adding a custom view for the results and removing/overriding the updated/published fields.
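For the second option, the safe logic is essentially the vendor method shown earlier with a try/catch around the forced parse. A self-contained sketch of that logic (the parser function stands in for the internal DateFormatter.parse; in the real subclass you would override entryUpdated()/entryPublished() instead):

```java
import java.util.Date;
import java.util.function.Function;

public class SafeEntryDates {
    // Same branching as entryUpdated()/entryPublished(), but a value that cannot
    // be parsed with the forced format falls back to "now" instead of blowing up
    // the whole response entry.
    static Date parseOrNow(Object value, Function<String, Date> parser) {
        if (value == null) return new Date();
        if (value instanceof Date) return (Date) value;
        try {
            return parser.apply(value.toString()); // DateFormatter.parse in the real class
        } catch (IllegalArgumentException ex) {
            return new Date(); // custom-formatted dates no longer break the entry
        }
    }

    public static void main(String[] args) {
        Function<String, Date> strictParser = s -> {
            throw new IllegalArgumentException("Invalid format: \"" + s + "\"");
        };
        System.out.println(parseOrNow("12/04/2013", strictParser)); // falls back, no exception
    }
}
```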