So my trainer last week showed off his Kindle to me. I must say I was more impressed than I expected to be. The demonstration turned into a discussion of the pros and cons of owning one and why someone like me would be offended by the DRM included in the device. But, after thinking over his comments and looking at the features more myself, I am now wondering when I can afford one.
I read a lot of blogs on the subway. Soccer, programming, etc. I have even synced down technical documentation and manuals. My current process is to use Plucker via the Sunrise application to sync those down to my old Treo. Code is not a very realistic option on my current device as it doesn't wrap well at all.
I'm cheap, so I had concerns that owning a Kindle would mean I couldn't get all that nice free content onto the device. No problem. The Kindle has an email account you can send content to. If you have something in Word or plain text, you can email it to the Kindle account and it will show up on the device. The device comes with free wireless connectivity built in.
My current method works great, but a nicer screen is so tantalizing. I want to read more. Unfortunately, I don't have the money to buy books as often as I would prefer. And NYC doesn't have a nice used-book chain with good titles on the shelves like Half Price Books in Austin. Alas, my choices are thin on the ground.
The Kindle screen is excellent for reading. It's hard to appreciate the difference from pictures on a PC screen. If you know someone with one, just take a look at theirs.
I was surprised to see that the Kindle sold out in five and a half hours. Jeff Bezos apologized that there was still a six-week delay. Interesting. As of today, the price is still sitting at $399. Ouch. Maybe at $300-350 this would be more attractive for my pocketbook. However, demand will not allow for such a price reduction anytime soon. It will take competition to force one.
Maybe in 2009 things will look better.
Sunday, March 30, 2008
Friday, March 28, 2008
Documentum System Administration Training
I was pleased to wrap up my week of training with 2 days of Documentum System Administration training. This is the basic training. As you'll notice in my notes, there are some things that are only covered in the advanced class. Such is life.
This post will cover both days. I found myself taking fewer notes in this class as it was more interactive with fewer students. And the labs were more interesting. Also, the slides pretty much cover the details you need to know. As with Livelink administration, I noticed this is something you get trained for. But, you probably learn a lot more by doing and troubleshooting over time.
Again, these are mostly just raw notes. Not intended as entertaining literature :-) One thing about these notes is that this class covered Documentum version 6. There are a lot of comparisons to Livelink here as that's the world I come from and understand as it pertains to ECM. Of course, I've also tinkered with Alfresco but nothing serious (yet).
Day 1 of Administrator Training.
Add-on services. Collaboration Services does chat, forums, portal, etc. Do we have this? It turns out we do. After asking this question, I found out there are quite a few differences between eRoom and Collaboration Services. Not something I will go into here. But, suffice it to say that we will most likely be focusing on the latter where possible.
LDAP integration. I noticed that Notes LDAP is not listed in the supported list of LDAP servers. Yes, we use Notes where I work. I know. I know.
During our lab, our Reassign User job wouldn't run. We fixed it with a DQL query and then restarted the Docbase.
Language packs are very easy to do. I can't believe we spent so much time worrying about this in the evaluation process.
Jobs - the Window interval is worth noting. Setting it to 60 means the job can ONLY be run within 60 minutes on either side of its scheduled time. If you want a job to be runnable at any time, the number is weird: you'd expect 720, but it's 1440. I don't care what they say. That is an odd implementation of a job scheduler.
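To make the window arithmetic concrete, here's a tiny Java sketch of the rule as I understand it from class (this is just my interpretation, not how the Content Server actually implements it):

import java.time.Duration;
import java.time.LocalTime;

// My reading of the window interval: a job may only start within
// windowMinutes on either side of its scheduled time.
public class JobWindowCheck {
    static boolean inWindow(LocalTime scheduled, LocalTime now, int windowMinutes) {
        long diff = Math.abs(Duration.between(scheduled, now).toMinutes());
        diff = Math.min(diff, 1440 - diff); // wrap around midnight
        return diff <= windowMinutes;
    }

    public static void main(String[] args) {
        LocalTime scheduled = LocalTime.of(2, 0);
        System.out.println(inWindow(scheduled, LocalTime.of(2, 45), 60));  // true
        System.out.println(inWindow(scheduled, LocalTime.of(4, 30), 60));  // false
        // With +/-720 every minute of the day is already inside the window,
        // which is why needing 1440 for "run anytime" seems odd to me.
        System.out.println(inWindow(scheduled, LocalTime.of(14, 0), 720)); // true
    }
}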
Day 2 of Administrator Training. Happy Friday.
We discussed distributed repositories. See distributed repositories reference. We didn't go too deep into content replication as that's in the advanced administrator course.
BOCS (Branch Office Caching Services) is a standalone caching server. It doesn't replicate content (except maybe an initial pre-population). It's lightweight, and it's not a full ACS (you can't run jobs and stuff). Easy to administer, etc. It might be a good solution for remote sites. Requires web-based Documentum clients.
ACS/BOCS is read-only by default, I believe, but you can make it writable. It's configurable.
Regarding failover, Documentum doesn't need to know about it when there's only one Content Server. With multi-server repository configurations, you have to tell each connection broker about the other servers using proximity values. If a server's value is prefixed with 0 (in the thousands position), it is used for data requests; if it's prefixed with 9, it's used for content requests. There can be multiple of each, and requests go to the closest one by number.
The proximity values are four-digit numbers: ####. But you need to split that out as the first digit and then the next three. For example, 0100 is 0 and 100, and 9001 is 9 and 001. So that's a server for data requests at proximity 100 and a server for content requests at proximity 1, respectively.
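Here's a toy Java sketch of that split, just to keep the digit positions straight in my head (the class and field names are mine, not Documentum's):

// Illustrative only: split a 4-digit connection broker proximity value into
// the leading digit (0 = data requests, 9 = content requests, per my notes)
// and the 3-digit proximity used to pick the closest server.
public class ProximityValue {
    final char kind;      // '0' or '9'
    final int proximity;  // 0..999

    ProximityValue(String raw) {
        if (raw.length() != 4) {
            throw new IllegalArgumentException("expected a 4-digit value: " + raw);
        }
        this.kind = raw.charAt(0);
        this.proximity = Integer.parseInt(raw.substring(1));
    }

    @Override
    public String toString() {
        String use = (kind == '9') ? "content requests" : "data requests";
        return use + ", proximity " + proximity;
    }

    public static void main(String[] args) {
        System.out.println(new ProximityValue("0100")); // data requests, proximity 100
        System.out.println(new ProximityValue("9001")); // content requests, proximity 1
    }
}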
Failover support - in this class, we are primarily talking about Content Server failover support, not application server failover support, which is different.
Multiple repositories (docbases). You usually don't want just one, as different groups within your organization may want their own. You can use object replication. This is not content replication like in the remote location/caching scenarios; instead, you are replicating objects. It's important to keep track of which one is the original. If you try to check out an object, it must connect to the original and check that one out; then you can edit the replica. If someone does something on the original server, it doesn't update replicas immediately. It gets scheduled to replicate. See object replication jobs.
Interesting: you can also configure it in such a way that you replicate objects manually. You could export objects from one repository and burn them to a CD. Example: a movie studio with big, big files. You don't want to push that stuff across the network.
When handling multiple repositories, you need a repository federation in order to keep users/groups/permissions consistent across repositories, as each repository has its own permissions and its own set of users.
One repository must be named as the governing repository. Superuser privileges are not replicated. That's good. Setting up a federation is pretty easy: you create an object and define the governing repository and a job to do the replication.
For object replication, each job represents only a single cabinet or folder.
Storage areas are neat. You can actually specify whether or not to leave the extensions on. In Livelink this is not the case. Extensions are not included in LL.
Interesting choices on storage areas. I hope nobody asks for anything crazy. Blobs are always a bad idea. However, the Content Addressing option (which forces you to use one of their Centera devices) could be an option for some types of content, especially email. The reason is we need to archive that stuff, and it's read-only anyway. So there's value there.
Records migration jobs. Interesting way to maintain your storage areas. I suppose you'd do this if disk space was more critical. I never had to do anything like this with Livelink, b/c I made sure I always had ample disk space. Keep it simple.
dm_ContentWarning - tells you if the drive is getting full. Set up a job to do this.
dm_DMClean - cleans up orphaned files, aborted workflows, etc. Like an empty-trash function.
dm_DMFilescan - looks at the filesystem first and then checks the repository (the opposite direction from dm_DMClean). This is more useful in a distributed environment where you are using content replication. So, you need both in a complex architecture.
See also dm_RenditionMgt, dm_VersionMgt, and dm_RemoveExpiredRetnObjects.
It's amazing how transparent they are about their system. I don't remember OpenText being this open about how everything works. Often it seemed in LL they had options in the UI that they told you never to touch without talking to support. The Documentum folks seem to take a different approach. They WANT you to know what everything does as much as possible. Of course, they don't cover everything in this course. That's why they have an advanced admin course. Who knows. Maybe the OpenText culture has changed a little in the last 2 years?
Logging. Well, the logs aren't all in one place. There are lots of places to look. Reminiscent of my experience administering Oracle AS. That's what happens when you buy other software and try to merge it with your own. :-) But, it looks pretty reasonable. They are using Log4J. w00t. Wow. You can filter DFC tracing to look at only a single user.
Consistency Checker. This is great. Livelink has something like this as well. Every system should have this!
Search. They used to use Verity before v5.3. Didn't scale as well. Now they use an Index Agent/Index Server package. Here we go. Index administration. More and more with Livelink administration, this became a large part of your responsibility. Search must ALWAYS be working and there are a lot of configuration options. It will be interesting to see how difficult this is to manage with Documentum.
Full text software has to be installed separately. Waa? Who wouldn't want their data full text indexed these days? Whatever. Must be for the small group option.
Wow. This Indexer reserves the next 4000 ports after the one you assign. Beastly. No wonder they insist you run it on a separate box from the Content Server.
When you run the installer for the indexer, one of the screens displays prerequisite information. And one of those items is that they do NOT support VMs for this. Thank GOD. I've had more trouble with VMs for certain software where I work than you might imagine. So, it will be nice to just point to that screenshot. The thing is, VMs are not good enough to handle disk-intensive systems like Index Servers.
Interesting architecture note: each Index Server uses a self-contained copy of Tomcat because it might be installed on a separate box.
By default, everything (dm_sysobject) is indexed. If you want to manage that selectively, you turn it off and explicitly name the types to index. Livelink worked like the latter. It will be interesting to see if there are any problem document types. Every now and again in Livelink, some documents would cause trouble. In fact, there were sometimes mystery documents that simply would not index, in which case you could exclude them by node ID. Subtypes inherit their parent's indexing property: if the parent is indexed, so are the child types.
There's also a self-paced one-day course on index administration if we want to learn a little more about search. It's not free, but it's cheaper. "Online" means an online class like this one. "Self-paced" is different.
Upgrading
Run the Consistency Checker prior to upgrade. Man, that is right out of the Livelink book. Do I know that's true :-)
References to have
- Server Fundamentals Guide (the bible)
- Server Administration Guide
The Advanced class gets into performance tuning.
There is a System Administration certification. It's only a couple hundred bucks. You can take it multiple times if you need to, but you have to pay each time. The WDK course is a two-day course and might also be useful.
Overall, my training was quite good. The trainer gets good marks. I took these classes from MicroTek down in the Financial District.
That's it for my training. But, word is I may get some more soon. DFC. Look out.
Documentum Training, Day 3
This took a while for me to post, but here it is: finally, my notes for the Tech Fundamentals training for Documentum, day 3. This was a 3-day course. This will be followed by my notes from the 2-day System Administration course. These are mostly just raw notes. Not intended as entertaining literature :-) Again, there are a lot of comparisons to Livelink here as that's the world I come from and understand as it pertains to ECM. Of course, I've also tinkered with Alfresco but nothing serious (yet).
Yumm. More free coffee.
Day 3
DocApps are self-contained packages of repository objects. It used to be you had to write down everything you did in QA and then carefully move everything to another repository (production, for example). Now you have DocApp archives to handle migration. Excellent. This is something that was always missing from Livelink. Moving a form template in Livelink from QA to production was (still is?) a manual, scary process.
Use Document Application Builder (DAB) to create objects, lifecycles and other things. If you are writing your own application with WDK, then these objects would need to be created with DAB first. In version 6 this can be done in Composer. Not all features are included in Composer (yet). The new Composer tool is in Eclipse so we're all working in one tool. WDK is also in Eclipse but is a separate tool for programming and customization.
As I see it, aliases are sort of like a variable. Sounds like a parameter to me. Either way, it's a dynamic placeholder in a bigger process. Aliases go inside alias sets, and they can be manipulated programmatically using DQL or DFC. Nice. Alias sets are basically just like maps. That's interesting. Livelink has a similar dynamic variable ability within workflows, but I think that's about it. Aliases in Documentum can be used across object types.
Pointer to documentation: Server fundamentals reference. Good for looking up how the server rules work.
I am really pleased with custom types in Documentum. The fact that when you create a "custom type" you can have dynamic attributes is a blessing. I am muy impressed. I could not do that with Livelink attributes out of the box 2 years ago.
Finally. Workflow. In my experience, this is where an ECM's bread is buttered. Can it hold up to the Livelink workflow engine?
Workflow packages. Documentum supports multiple packages and a choice of mandatory or optional (using ProcessBuilder, their BPM tool). Otherwise, packages are mandatory. The built-in workflow package is content-centric. I liked the trainer's Star Wars reference for mandatory packages: if you don't have ProcessBuilder installed, just treat it as "this is not the document you are looking for" :-) I'm pretty certain we will use ProcessBuilder. I can't see a reason not to. I'd much rather have the extra options it offers than just the built-in engine.
Workflow forms use XForms. w00t. And it's part of the Application Builder. Thankfully not as "weird" as Livelink's. However, we don't get to do any forms here. That's another class. Hopefully my group can figure this out without taking the class.
Another thing I want to figure out is workflow reporting in Documentum. In Livelink, that was also weird and needed improvement. In Livelink, if you had 1000 open workflow instances, reporting against them was very hard to do. I like the availability of expressions, which makes dynamic workflow designs easy in Documentum. However, I can see why they have a four-day business process class. The reason they have all this power is the ProcessBuilder acquisition. It's truly full-featured like other BPMs. Has Livelink changed to a BPM yet? Hmm. Will have to ask around.
Finally, we finished up with workflow. I love it (so far).
Then we did lifecycle. OK. Simple enough.
Then we did presets... easy and useful.
That's all for day 3.
One more post on this is coming: my notes from the sys admin class.
Documentum Training, Day 2
I'm a little late, but here are my notes for the Tech Fundamentals training for Documentum, day 2. It's a 3-day course. This will be followed by my notes from the 2-day System Administration course. These are mostly just raw notes. Not intended as entertaining literature :-) Again, there are a lot of comparisons to Livelink here as that's the world I come from and understand as it pertains to ECM. Of course, I've also tinkered with Alfresco but nothing serious (yet).
Yumm. More free coffee.
There's an interesting separation of "client capability" versus privileges (server side).
Privileges are summed using powers of 2. I think this is the same type of math that OpenText Livelink uses for permissions. I remember working with my colleague a time or 2 to write reports that would scan our folder taxonomy trying to figure out how to report on permissions. It wasn't easy. So, the concept goes something like this:
create type (1) + create cabinet (2) = 3 in db.
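A quick Java sketch of that bit math, using just the two privilege values mentioned above (the constant names are my own):

// Each privilege is a distinct power of 2, so a user's privileges can be
// stored as a single summed integer and decoded with bitwise operations.
public class PrivilegeMath {
    static final int CREATE_TYPE = 1;     // 2^0
    static final int CREATE_CABINET = 2;  // 2^1

    public static void main(String[] args) {
        int userPrivileges = CREATE_TYPE | CREATE_CABINET;
        System.out.println(userPrivileges); // 3, the value you'd see in the db

        // Testing for a single privilege is a bitwise AND, which is what a
        // permissions report would have to do to decode the column.
        boolean canCreateCabinet = (userPrivileges & CREATE_CABINET) != 0;
        System.out.println(canCreateCabinet); // true
    }
}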
Distinction: sysadmin vs. superuser. Superuser is the most powerful and can delegate privileges to sysadmins. A sysadmin is similar to a Livelink admin, but perhaps a little more powerful?
You can get to the DQL editor (if you have permission) by holding down Control on your keyboard and selecting File | About Webtop. But what's more interesting is that they also included an API tester. Wow.
Groups seem much more robust than in Livelink. There are more options. In Livelink, you have the concept of roles in project workspaces only; in Livelink's Enterprise workspace, the concept is permission-based only (without customization). In Documentum, you can use roles pretty much everywhere. And the delegation of group membership and such looks better than what Livelink offers with its Group Leader feature. But wait: roles don't have any special meaning in the out-of-the-box Webtop application. You'd have to customize Webtop to recognize a role and do something with it; otherwise it's really just treated like a normal group. Still, this gives you a way to separate groups from roles as a programmer, and something over and above the default built-in client capabilities.
Dynamic groups. Fascinating concept. These are potential members of a group. If an application can detect where the user is connecting from, the application can say "this user is a member of this group"; otherwise, no access. You MUST already be a potential member of the group. And it must be a custom application; out-of-the-box Webtop isn't going to take advantage of this. Setting up dynamic groups just makes it easier for your custom application to enforce this kind of requirement.
Permission sets. The key point here is that ACLs are templates and are reusable. That is VERY different from Livelink. Let me break it down so my brain can get it.
- A permission set is like a template that you apply to different objects, whereas Livelink's ACLs are applied to objects but are not reusable.
- There are system permission sets and user perm sets.
- If you modify the permission set on a folder, it doesn't affect or restrict a document inside unless you propagate that down. And it doesn't seem that you can do that with the UI; I think you'd have to do that with DQL? If so, that's a weakness compared to LL, as LL has a cool interface to propagate permission changes to children. However, Livelink would tell you to never modify the database. So, the DQL option wins out for flexibility.
- You can apply permissions to a specific object. These are custom permission sets. This is more like how Livelink works, but it's not going to be a best practice. If you modify the properties on an object, you can modify the permissions, for temporary use for example. It will create a "custom permission set". It's a throwaway with a generated name like "dm_4500000abcs", etc.
The overview of search in class seems adequate. It doesn't sound like they have an equivalent to the Livelink query language. In other words, their advanced search has some neat-o form-based thinger. And you can save searches, etc. However, I do see where you could explicitly define a search expression against the index. Maybe I just need to look at it more. Also, our instances didn't include a full-text index of the Docbase.
I asked about WebPublisher. Per the trainer, this tool is geared towards the non-HTML folks.
Example: press releases use template(s); marketing folks fill them out.
WebPublisher has a built-in review/approval process, and the content gets pushed out/expired at the right time. There's a 3-day course on WebPublisher.
In short, WebPublisher templates are usually represented as XML and are transformed. Of course, they can be pages that access a Docbase from front to back with no replication at all. It's flexible. It can also be done with static HTML but will not be as flexible and will look a lot like the way it does in WebPublisher. I asked about skill level. It would not typically take a programmer to do the templates. You just need knowledge of HTML and of course a little XML. So, an analyst could do this. If the template is complex enough, you might need someone with a little more skill. That's probably the most info you're gonna get without training or reading the docs.
Tuesday, March 25, 2008
PownceFS efforts
I'm trying to get Richard Crowley's PownceFS to work.
I am at the point where I'm asking for help now.
I'm using Ubuntu gutsy. I'm sooo close.
I added svn, python-fuse and python-json as it was complaining about those.
Now I got a permission error. I tried using sudo and got it to run on /mnt without the permission error. Now that folder is kind of screwed. I've been using Linux for a while, but this mountpoint stuff is throwing me off.
Here's a sample. Any Ubuntu folks have comments?
UPDATE:
All fixed thanks to help from others. Get rid of the "/" and let it create regular folders. And make sure it can access fuse and fusermount.
Monday, March 24, 2008
Documentum Training, Day 1
As twittered earlier today, I learned that Documentum's metaphors are not that different from those of OpenText Livelink. That's good. My brain still has a LOT of Livelink info in it, and it's good to see it not wasted.
It seems everything in Documentum is an object, and it is all object-oriented. That means new objects can be created from existing ones. Same concept in Livelink. In Documentum, you have cabinets. Livelink doesn't have that; a cabinet is probably closest to a Livelink workspace object or maybe even a domain. It's just a top-level folder object.
I'm happy to see that they do have compound documents, but they are called virtual documents in Documentum. Otherwise, it's the same concept.
Reports are possible. One of the things I REALLY liked about Livelink was the LiveReport object. Documentum has this ability. It might even be a bit better, because they abstract away the underlying query language a bit with their DQL (Documentum Query Language). I'm sure I will have more to say on this in the next few days. What I want to know is whether report results can take on interesting forms like graphs. With LiveReports you could use a handy little set of graphs to show things like the ten largest file types or the top 10 usage by department, etc. If it's not out of the box, I'm sure some 3rd party has a module for this.
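For the curious, here's a minimal sketch of what running a DQL query from Java looks like, assuming the standard DFC session classes; the repository name and credentials are placeholders, and this isn't anything we did in class:

import com.documentum.com.DfClientX;
import com.documentum.com.IDfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.IDfLoginInfo;

// Hypothetical example: list document names and owners from a repository.
// The repository name and credentials below are placeholders, not real values.
public class SimpleDqlReport {
    public static void main(String[] args) throws Exception {
        IDfClientX clientX = new DfClientX();
        IDfClient client = clientX.getLocalClient();
        IDfSessionManager sessionManager = client.newSessionManager();

        IDfLoginInfo login = clientX.getLoginInfo();
        login.setUser("dmadmin");                         // placeholder account
        login.setPassword("secret");                      // placeholder password
        sessionManager.setIdentity("my_docbase", login);  // placeholder repository

        IDfSession session = sessionManager.getSession("my_docbase");
        try {
            IDfQuery query = clientX.getQuery();
            query.setDQL("SELECT object_name, owner_name FROM dm_document");
            IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
            try {
                while (results.next()) {
                    System.out.println(results.getString("object_name")
                            + " owned by " + results.getString("owner_name"));
                }
            } finally {
                results.close();
            }
        } finally {
            sessionManager.release(session);
        }
    }
}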
It's interesting that a lifecycle option seems to be there out of the box with Documentum. As of 2005, I don't think this was out of the box with document management in LL.
Regarding version control, that seems to work OK. I was impressed with the idea of branching versions. If your document was at version 2.0 and you went back, checked out version 1.1, and versioned it again, it creates a branch. Weird. I don't know that Livelink does this? Anyone know for sure?
And what about modules? Livelink is all about modules, or at least it used to be as of 2005. And it used to be all OScript until they introduced Java modules. Documentum is more about "applications". I'm not in developer training; this is the "Intro" training, with sysadmin training later this week. So, I probably will not get to the bottom of this until a bit later.
When I first laid eyes on "webtop" back in 2005, I was very disappointed. After today, I'm a bit impressed with version 6. I say "a bit" because I still believe that mimicking a file explorer inside a browser is not what we should be shooting for. That's probably why most organizations customize it and make their own Documentum application.
There's still a lot to be evaluated this week. Workflow and search are big pieces of the puzzle. Livelink's search engine was top-notch, and its workflow wasn't bad either. We'll see how Documentum measures up here.
Sunday, March 23, 2008
Caszzzz out
I have Documentum training all week this week. You know what that means. Jeans! Oh yeah.
I already have several years of OpenText Livelink experience. It will be interesting to see how the architecture, API and other stuff compares. I have already played around with Alfresco in the past. This should round out my ECM knowledge pretty well. Since most employers have multiple ECM's or repositories in their organization, it doesn't hurt my resume at all :-)
Sunday, March 16, 2008
Great Ideas : AirBed and Breakfast
I had a lot of fun attending barcampnyc3 yesterday. I didn't present after all. Based on what others were doing and the vibe I got, I just didn't feel my presentation was of particular interest. Plus, I think I need to come out of my shell a little more. More on the barcamp experience later.
As I review my notes I am amazed how many new social sites are out there. This has led me to some other great ideas. Look out.
Here's a great idea. It's called AirBed&Breakfast. It was launched on March 3 and was useful in providing beds for SXSW attendees. I was just talking to the wife about a similar concept last night. I like this concept! I wish I had thought of it first! Check it out. You can do 2 things here. First, you have to create a profile. Then, you can choose to offer up your couch (or airbed) and optionally provide some form of breakfast and maybe even a little welcome to town local information. Going out of town for a conference or something? You can do the same thing. I can see this working for other events. Hmmm. GenCon maybe? Be your own hostel I guess.
My profile
Saturday, March 15, 2008
Pownce stuff
Last night I got started on a Java API that will talk to the Pownce API 2.0. I haven't seen anyone else doing it yet. I'm sure someone else will try. Anyway, I'm thinking of using this in a plug-in for something else.
I've got some simple stuff done that queries your account and your friends. That was pretty easy. But, the authentication has a bug in it. Hopefully, Leah will get it fixed soon.
I'm heading off to barcamp here in about an hour. This should be fun.
Tuesday, March 11, 2008
Minicards have arrived!
I am ecstatic. I received my shipment of minicards in today.
Admittedly, I stole the idea from Andrew Hyde of StartupWeekend fame. But, hey that's OK. He and I are not competing with each other.
These "minicards" are cheap, only $20. Try some from moo.com. Be warned. They ship from London so that increases the price and time for shipment a bit. It took about 2 weeks for me to get mine.
You can see how nice the photo came out. The text on the back is easy to read, but you wouldn't know it from my lousy camera. You can get a better idea of what the text might look like from Andrew's post linked above. Just click through to Flickr and hit All Sizes.