Enterprise Content Management

Armedia Blog

Create a Self-Extracting Installer in Linux

November 19th, 2014 by Paul Combs

Ultimately, many would want to write complex RPMs to install software packages; however, those who are responsible for writing such packages may agree that the task can be cumbersome and impractical. While an RPM accomplishes the goal of wrapping an application in a container for distribution, the same goal can also be met with a self-extracting archive, an installer that launches an embedded script. This is where a utility called makeself comes in. Makeself is described as “a small shell script that generates a self-extractable tar.gz archive from a directory. The resulting file appears as a shell script (many of those have a .run suffix), and can be launched as is.”

Install makeself

cd /opt/app
wget http://megastep.org/makeself/makeself-2.1.5.run
chmod 755 makeself-2.1.5.run
./makeself-2.1.5.run
cd makeself-2.1.5
cp *.sh /usr/bin
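
To confirm the scripts landed on your PATH (a quick, optional check), run makeself.sh with no arguments; it should simply print its usage summary:

which makeself.sh
makeself.sh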

Example
Suppose you want to package and distribute a version of libevent2 for several CentOS 6 servers. Here is one way.

mkdir rpmfile
cd rpmfile

wget ftp://ftp.pbone.net/mirror/ftp5.gwdg.de/pub/opensuse/repositories/home:/aevseev/CentOS_CentOS-6/x86_64/libevent2-2.0.21-1.1.x86_64.rpm
wget ftp://ftp.pbone.net/mirror/ftp5.gwdg.de/pub/opensuse/repositories/home:/aevseev/CentOS_CentOS-6/x86_64/libevent-devel-2.0.21-1.1.x86_64.rpm

echo '#!/bin/bash
yum install -y libevent2-2.0.21-1.1.x86_64.rpm libevent-devel-2.0.21-1.1.x86_64.rpm' > libevent-install.sh
chmod +x libevent-install.sh
cd ..
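
Makeself runs the embedded startup script from the directory it extracts to, which is why the relative RPM filenames above resolve. If you would rather keep the installer in a standalone script file than build it with echo, a slightly more defensive sketch (same behavior, using the filenames downloaded above) could look like this:

#!/bin/bash
# libevent-install.sh - the RPMs are expected to sit alongside this script
set -e
cd "$(dirname "$0")"
yum install -y \
    libevent2-2.0.21-1.1.x86_64.rpm \
    libevent-devel-2.0.21-1.1.x86_64.rpm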

Command

#makeself.sh ./DIRECTORY ./FINAL-PRODUCT-NAME.run "SFX installer for COMMENT HERE" ./FILE-IN-DIRECTORY-TO-RUN.sh
makeself.sh ./rpmfile ./libevent-devel.run "SFX installer for libevent-devel (2.0.21)" ./libevent-install.sh

Use the .run file

./libevent-devel.run
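
The generated archive also understands a handful of standard makeself flags (these are makeself built-ins, not anything specific to this example), which are handy for inspecting an installer before trusting it:

# Show the embedded help, archive info, and file list without installing anything
./libevent-devel.run --help
./libevent-devel.run --info
./libevent-devel.run --list

# Verify the embedded checksum, or extract the files without running the install script
./libevent-devel.run --check
./libevent-devel.run --noexec --keep --target ./libevent-extracted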

Source(s)
http://xmodulo.com/how-to-create-a-self-extracting-archive-or-installer-in-linux.html
http://megastep.org/makeself/

CMIS Integration – Integrating FileNet with SharePoint 2013

October 17th, 2014 by Ben Chevallereau

Recently, our team has been working on a series of CMIS integrations. This video demonstrates the CMIS components that we developed and used to integrate FileNet with SharePoint 2013. This integration has been packaged into SharePoint. During the video, you’ll see how to connect to FileNet, browse the repository, create folders, create documents, and preview and download documents.

Predictive Analytics and The Most Important Thing

October 15th, 2014 by Jim Nasr

It turns out I was wrong…which happens at an alarmingly increasing rate these days—though I chalk that up to a thirst to challenge myself…errr, my story!

So, for a while now, I had convinced myself that I knew what the most important thing was about successfully doing predictive analytics: accuracy of data and the model (separating the noise). Veracity, as they say. In working with a few clients lately though, I no longer think that’s the case. Seems the most important thing is actually the first thing: What is the thing you want to know? The Question.

As technologists, we often tend to over-complicate and possibly over-engineer. And it’s easy to make predictive analytics focus on the how: the myriad ways to integrate large volumes and exotic varieties of data, the many statistical models to evaluate for fit, the integration of the technology components, the visualization techniques used to best surface results, etc. All of that has its place. But ultimately, first, and most importantly, we need to articulate the business problem and the question we want answered.

What do we want to achieve from the analytics? How will the results help us make a decision?

Easy as that sounds, in practice it is not particularly easy to articulate the business question. It requires a real understanding of the business, its underlying operations, data and analytics and what would really move the meter. There is a need to marry the subject matter expert (say, the line of business owner) with a quant or a data scientist and facilitate the conversation. This is where we figure out the general shape and size of the result and why it would matter; also, what data (internal and external) feeds into it.

Articulating The Question engages the rest of the machinery. Answers are the outcome we care about. The process and the machinery (see below for how we do it) give us repeatability and ways to experiment with both asking questions and getting answers.

[Figure: the Armedia Predictive Analytics process for getting from The Question to answers]

VIDEO - Alfresco CMIS Integration - A Sneak Peek

September 19th, 2014 by Ben Chevallereau

For a few months now, my fellow team members and I have been working on developing a CMIS integration to seamlessly allow Alfresco to be accessed from other platforms. This video demonstrates the components that we built on top of the CMIS 1.1 standard and packaged for different platforms like SharePoint 2013, SharePoint 2010, and Drupal. Using these components, you can browse your repository, find your documents, upload documents by drag and drop, edit online, or use full-text or advanced search. This video focuses essentially on the integration with Alfresco, but the components can be used with any CMIS 1.1 compliant repository. Complementary to these components, we also created a filtered search component. This one is compatible only with Alfresco, but with any version. Using this component, you can run a full-text search and filter the results using metadata like file type, creator, creation date, or file size.

These components are built only with JS, HTML, and CSS files, which is why they are so easy to repackage in other web platforms. Moreover, we built them to be highly customizable. Depending on your use case, you can customize these components to display relevant metadata, focus on a specific folder, add new filters, and a lot more.


For more information about our CMIS integration with Alfresco, join us next week in San Francisco for Alfresco Summit 2014!


Spring Managed Alfresco Custom Activiti Java Delegates

September 17th, 2014 by Judy Hsu

I recently needed to make a change to have Alfresco 4's Activiti engine call an object managed by Spring instead of a class that Activiti instantiates itself during execution. A couple of reasons for this:

  1. A new enhancement was necessary to access a custom database table, so I needed to inject a DAO bean into the Activiti serviceTask.
  2. Refactoring of the code base was needed. Having Spring manage the Java delegate service task, versus instantiating new objects for each process execution, is always a better way to go if the application is already Spring managed (which Alfresco is).
    • i.e., I needed access to the DAO bean and to the Spring beans Alfresco already makes available.
    • NOTE: You now have to make sure your class is thread-safe though!

For a tutorial on Alfresco's advanced workflows with Activiti, take a look at Jeff Potts' tutorial here. This blog will only discuss what was refactored to have Spring manage the Activiti engine Java delegates.

I wanted to piggy-back off of the Activiti workflow engine that is already embedded in Alfresco 4, so I decided not to define our own Activiti engine manually. The Alfresco Summit 2013 had a great video tutorial, which helped immensely in refactoring the “Old Method” to the “New Method”, described below.

Example:

For our example, we’ll use a simple Activiti workflow that defines two service tasks, CherryJavaDelegate and ShoeJavaDelegate (the abstract AbstractCherryShoeDelegate is the parent). The “Old Method” does NOT have Spring managing the Activiti service task Java delegates. The “New Method” has Spring manage and inject the Activiti service task Java delegates, and also adds an enhancement for both service tasks to write to a database table.

Old Method

1. Notice that the cherryshoebpmn.xml example below defines the serviceTasks with the “activiti:class” attribute; this has Activiti instantiate a new object for each process execution:

<process id="cherryshoeProcess" name="Cherry Shoe Process" isExecutable="true">
    ...
    <serviceTask id="cherryTask" name="Insert Cherry Task" activiti:class="com.cherryshoe.activiti.delegate.CherryJavaDelegate"></serviceTask>
    
    <serviceTask id="shoeTask" name="Insert Shoe Task" activiti:class="com.cherryshoe.activiti.delegate.ShoeJavaDelegate"></serviceTask>
    ...
</process>

2. Since we have multiple service tasks that need much of the same Java delegate functionality, we defined an abstract class that provides some of it. The specific concrete classes provide / override any functionality not defined in the abstract class.

...
import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;
...
public abstract class AbstractCherryShoeDelegate implements JavaDelegate {
...
    @Override
    public void execute(DelegateExecution execution) throws Exception {
    ...
    }
...
}

public class CherryJavaDelegate extends AbstractCherryShoeDelegate {
...
...
}

New Method

Here’s a summary of all that had to happen to have Spring inject the Java delegates for the Alfresco 4 custom Activiti service tasks (tested with Alfresco 4.1.5) and to write to database tables via injected DAO beans.

  1. The abstract AbstractCherryShoeDelegate class extends Alfresco's BaseJavaDelegate
  2. There are class load order issues where custom Spring beans will not get registered.  Set up a depends-on relationship with the activitiBeanRegistry for the AbstractCherryShoeDelegate abstract parent
  3. The following must be kept intact:
    • In the Spring configuration file:
      • The abstract AbstractCherryShoeDelegate class defines parent="baseJavaDelegate" abstract="true" depends-on="activitiBeanRegistry"
      • For each concrete Java delegate:
        • The concrete bean id MUST match the class name, which in turn matches the activiti:delegateExpression in the bpmn20 configuration XML file
          • NOTE: Reading this Alfresco forum, it looks like the activitiBeanRegistry registers the bean by class name, not by bean id, so this is likely not a requirement
        • The parent attribute MUST be defined as an attribute

Details Below:

1. Define Spring beans for the abstract parent class AbstractCherryShoeDelegate and for each concrete class that extends it (i.e., CherryJavaDelegate and ShoeJavaDelegate), so that Spring manages the custom Activiti Java delegates. The abstract parent must define its own parent as "baseJavaDelegate", abstract="true", and depends-on="activitiBeanRegistry".

<bean id="AbstractCherryShoeDelegate" parent="baseJavaDelegate" abstract="true" depends-on="activitiBeanRegistry"></bean>
    
<bean id="CherryJavaDelegate"
class="com.cherryshoe.activiti.delegate.CherryJavaDelegate" parent="AbstractCherryShoeDelegate">
    <property name="cherryDao" ref="com.cherryshoe.database.dao.CherryDao"/>
</bean>

<bean id="ShoeJavaDelegate"
class="com.cherryshoe.activiti.delegate.ShoeJavaDelegate"  parent="AbstractCherryShoeDelegate">
    <property name="shoeDao" ref="com.cherryshoe.database.dao.ShoeDao"/>
</bean>

***NOTE: BELOW WILL NOT WORK

- Do NOT put any periods to denote package structure in the bean id!  Alfresco/Activiti gets confused by the package ".", where Spring normally works fine with this construct.

- Also, just having the concrete class extend the parent abstract class is not enough to make it work.

<bean id="com.cherryshoe.activiti.delegate.CherryJavaDelegate"
class="com.cherryshoe.activiti.delegate.CherryJavaDelegate" >
    <property name="cherryDao" ref="com.cherryshoe.database.dao.CherryDao"/>
</bean>

<bean id="com.cherryshoe.activiti.delegate.ShoeJavaDelegate"
class="com.cherryshoe.activiti.delegate.ShoeJavaDelegate" >
    <property name="shoeDao" ref="com.cherryshoe.database.dao.ShoeDao"/>
</bean>

2. Notice that the cherryshoebpmn.xml example below is using the “activiti:delegateExpression” attribute and referencing the Spring bean.  This means only one instance of that Java class is created for the serviceTask it is defined on, so the class must be implemented with thread-safety in mind:

<process id="cherryshoeProcess" name="Cherry Shoe Process" isExecutable="true">
    ...
    <serviceTask id="cherryTask" name="Insert Cherry Task" activiti:delegateExpression="${CherryJavaDelegate}"></serviceTask>

    <serviceTask id="shoeTask" name="Insert Shoe Task" activiti:delegateExpression="${ShoeJavaDelegate}"></serviceTask>
    ...
</process>

3. The abstract class is now changed to extend Alfresco's BaseJavaDelegate.  The specific concrete classes would provide / override any functionality not defined in the abstract class.

...
import org.activiti.engine.delegate.DelegateExecution;
import org.alfresco.repo.workflow.activiti.BaseJavaDelegate;
...
public abstract class AbstractCherryShoeDelegate extends BaseJavaDelegate {
...
    @Override
    public void execute(DelegateExecution execution) throws Exception {
    ...
    }
...
}

public class CherryJavaDelegate extends AbstractCherryShoeDelegate {
...
}

For more examples and ideas, I encourage you to explore the links provided throughout this blog. Also take a look at Activiti’s user guide, particularly the Java Service Task Implementation section. What questions do you have about this post? Let me know in the comments section below, and I will answer each one.

The blog Spring Managed Alfresco Custom Activiti Java Delegates was originally posted on cherryshoe.blogspot.com.

U. S. Government Digital Acquisition Policy Gets an Update

September 11th, 2014 by Scott Roth

You may have seen the news that the U. S. Government has established the U.S. Digital Service, a small team designed “to improve and simplify the digital experience that people and businesses have with their government.” On the heels of that announcement came the news that Michael Dickerson, former Google engineer, has been selected to head up the U. S. Digital Service. And, in conjunction with these announcements, came some initial updates to the U. S. Government’s acquisition policies as they relate to software and computing solutions. It is these updates I would like to highlight in this post.

These initial updates come in the form of two documents, the Digital Services Playbook and the TechFAR, which really go hand-in-hand. The Playbook lays out best practices for creating digital services in the government, and the TechFAR describes how these services can be acquired within the confines of existing acquisition policy (i.e., the FAR). The Playbook discusses 13 “plays”, or best practices, that should be implemented to ensure delivery of quality applications, websites, mobile apps, etc., that meet the needs of the people and government agencies. Advocating and implementing these plays will be the Digital Service’s mission. As a long-time provider of software development services, I wasn’t too surprised by any of these best practices – and neither will you. However, it was refreshing to see the government finally embrace and advocate them. Here are the Digital Services Playbook plays.

  1. Understand what people need
  2. Address the whole experience, from start to finish
  3. Make it simple and intuitive
  4. Build the service using agile and iterative practices
  5. Structure budgets and contracts to support delivery
  6. Assign one leader and hold that person accountable
  7. Bring in experienced teams
  8. Choose a modern technology stack
  9. Deploy in a flexible hosting environment
  10.  Automate testing and deployments
  11. Manage security and privacy through reusable processes
  12. Use data to drive decisions
  13. Default to open

Like I said, you probably weren’t surprised by these practices; in fact, if you are a successful software services company, you probably already implement them. But remember, these practices are now being embraced by the U. S. Government, whose acquisition policy has traditionally been geared more toward building battleships than software solutions.

Speaking of acquisition, the TechFAR is a handbook that supplements the Federal Acquisition Regulations (FAR). The FAR is a strict and lengthy body of regulations all executive branch agencies must follow to acquire goods and services. The Handbook is a series of questions, answers, and examples designed to help the U. S. Government produce solicitations for digital services that embrace the 13 plays in the Digital Services Playbook. At first glance, you may not think that implementing these practices would require a supplement like the Handbook, but if you have any experience with the FAR, or agencies who follow it, you will understand that interpretation and implementation of the regulations varies from agency to agency, and they usually err on the side of caution (i.e., a strict interpretation of the policy).

In my experience, the single most difficult thing for a U. S. Government agency to accomplish under the FAR is play #4, the use of agile methodologies to develop software solutions. If you can accomplish this, many of the other plays will happen naturally (e.g., #1, #2, #3, #6, #7, #10). However, the nature of agile development – user stories vs. full system requirements, heavy customer participation vs. just follow the project plan, etc. – seems contrary to the “big design” methodology implied by the FAR. This notion couldn’t be more wrong. The TechFAR encourages the use of agile methodologies and illustrates how solicitations and contracts can be structured to be more agile.

Personally, I think the Digital Services Playbook and the TechFAR are a great starting point for improving the quality and success of government software solutions. And official guidance like this now brings the U. S. Government’s acquisition process in line with how Armedia has always developed software solutions, i.e., using agile methodology. No longer will we have to map our methodology and deliverables to an archaic waterfall methodology to satisfy FAR requirements.

I think the questions/answers/examples in the TechFAR are good, and provide terrific insight both for the government writing solicitations and for industry responding to them. If you sell digital services to the U. S. Government, I encourage you to read these two documents, the Digital Services Playbook and the TechFAR — they’re not long. And even if you don’t contract with the U. S. Government, the best practices in the Playbook and the advice in the Handbook are still probably applicable to your business.

WordPress Contributors Upload Plugins

August 7th, 2014 by Paul Combs

My previous post, “Allow WordPress Contributors to Upload Images”, discussed the use of the functions.php file to add capabilities to the contributor role that aren’t there by original design. Since the functions.php file is part of a WordPress theme, if an alternate theme is selected the functions will no longer be available unless the functions.php file is edited in that theme as well. With plugins, however, the functions remain.

This change is slightly different from many others, in that it is persistent. Even if the plugin is later disabled or the function is removed from functions.php, the change will remain until it is explicitly revoked. Two plugins are needed.

The first plugin adds the capability for the contributor role to upload content along with their posts. Once the plugin has been enabled, the change takes effect and persists even if the plugin is then disabled. Hence the need for the second plugin.

<?php
/*
Plugin Name: Armedia: Contributor Role Upload Enabler
Description: Adds the capability to the contributor role to upload content. This change is persistent until it is explicitly revoked. Based on the source by Hardeep Asrani.
Author: Paul Combs
Version: 1.0
Author URI: http://www.armedia.com
*/

function allow_contributor_uploads() {
    if ( current_user_can( 'contributor' ) && ! current_user_can( 'upload_files' ) ) {
        $contributor = get_role( 'contributor' );
        $contributor->add_cap( 'upload_files' );
    }
}

add_action( 'admin_init', 'allow_contributor_uploads' );
?>

The second plugin removes the capability of the contributor role to upload content. Once the plugin has been enabled, the change takes effect and persists even if the plugin is then disabled.

<?php
/*
Plugin Name: Armedia: Contributor Role Upload Disabler
Description: Removes the capability of the contributor role to upload content. This change is persistent until it is explicitly revoked.
Author: Paul Combs
Version: 1.0
Author URI: http://www.armedia.com
*/

function remove_contributor_uploads() {
    if ( current_user_can( 'contributor' ) && current_user_can( 'upload_files' ) ) {
        $contributor = get_role( 'contributor' );
        $contributor->remove_cap( 'upload_files' );
    }
}

add_action( 'admin_init', 'remove_contributor_uploads' );
?>

There are no checks and balances here, so it should be noted that if both are enabled the results will not be as expected. A quick test of refreshing a contributor screen with both plugins enabled will reveal that the capability is available every other refresh. For the expected result, select one or the other.

Allow WordPress Contributors to Upload Images

August 5th, 2014 by Paul Combs

WordPress offers six different roles ranging from Super Admin to Subscriber. There is one role that permits users to write and manage their own posts but not publish them: the contributor. Writing a post without images and submitting it for approval to publish is as easy as it gets. Posts of that nature are rare and can get a little boring. Images help make a post more interesting. However, as a contributor, images cannot be uploaded with the post. A number of workarounds may be put into place to remedy this, but each can be time-consuming and often means repeated effort. This may not be so bad for a short post, but one with many images can be more challenging than the effort is worth.

Allowing contributors to upload images with their posts would greatly simplify this. One site offers a snippet of code to add to the theme’s functions.php file.

if ( current_user_can('contributor') && !current_user_can('upload_files') )
    add_action('admin_init', 'allow_contributor_uploads');

function allow_contributor_uploads() {
    $contributor = get_role('contributor');
    $contributor->add_cap('upload_files');
}

This has been tested as working with WordPress 3.9.1. Here is an after and before screenshot of a contributor’s admin dashboard. Notice the image on the left has a Media option.

[Screenshot: after and before views of the contributor dashboard]

A contributor may now upload their post with images ready for someone else to Publish. After a successful upload, from the Media option take note of a couple of differences between images submitted by the contributor and images submitted by others.

This is an image submitted by anyone other than the contributor. Notice that the contributor may only view the image and no other action may be taken.

[Screenshot: image submitted by someone other than the contributor]

This image has a box next to it to allow for bulk actions. The contributor may also Edit or Delete Permanently their own image as well as view it.

[Screenshot: image submitted by the contributor]

A second contributor account was created to verify that another contributor may only view other contributors’ images (as well as any other image), and may perform the additional actions only on their own images. The results were as expected.

It is important to note that even if the code is removed from the functions.php file, the contributor role will still have the capability to upload content. This capability is persistent until explicitly revoked; the setting is saved to the database. To explicitly revoke this capability, simply reverse the action by editing the code above and appending the following to the functions.php file.

if ( current_user_can('contributor') && current_user_can('upload_files') )
    add_action('admin_init', 'remove_contributor_uploads');

function remove_contributor_uploads() {
    $contributor = get_role('contributor');
    $contributor->remove_cap('upload_files');
}

Although the functions.php may be modified with either of the pieces of code provided above, a cleaner and more portable method would be the use of custom plugins. One plugin to enable uploads and another to disable them. That could be the topic of my next article…

Source(s)
http://codex.wordpress.org/Roles_and_Capabilities
http://www.trickspanda.com/2014/01/allow-contributors-upload-images-wordpress/
http://codex.wordpress.org/Function_Reference/add_cap

How to Export Tabular Data in Captiva 7

July 29th, 2014 by Scott Roth

Armedia has a customer using Captiva 7 to automatically capture tabular information from scanned documents. They wanted to export the tabular data to a CSV file to be analyzed in Excel. Capturing the tabular data in Captiva Desktop proved to be simple enough; the challenge was exporting it in the desired format. Our customer wanted each batch to create its own CSV file, and that file needed to contain a combination of fielded and tabular data expressed as comma-delimited rows.

Here is an example of one of the scanned documents with the desired data elements highlighted.

[Image: scanned timecard with the desired data elements highlighted]

Here is an example of the desired output.

EMPLOYEE,EID,DATE,REG HRS,OT HRS,TOT HRS
ANDREW MARSH,084224,4/22/2013,7,0,7
ANDREW MARSH,084224,4/23/2013,7.5,1,7.5
ANDREW MARSH,084224,4/24/2013,4,0,9
ANDREW MARSH,084224,4/25/2013,8.5,0,8.5
ANDREW MARSH,084224,4/26/2013,12,0,12
BARB ACKEW,084220,4/22/2013,7,0,7
BARB ACKEW,084220,4/23/2013,9.5,0,9.5
BARB ACKEW,084220,4/24/2013,9.5,0,9.5
BARB ACKEW,084220,4/25/2013,2.5,0,2.5
BARB ACKEW,084220,4/26/2013,8,.5,8

As you can see, the single fields of Employee Name and Employee Number are repeated on each row of the output.  However, because Employee Name and Employee Number were not captured as part of the tabular data on the document, this export format proved to be a challenge.

Here’s what I did:

  1. In the Document Type definition, I created fields for the values I wanted to capture and export (Name, EmployeeNbr, Date, RegHrs, OTHrs, TotHrs).  Here’s how it looks in the Document Type editor:

[Screenshot: the Document Type editor]

  2. In the Desktop configuration, I configured:
    • Output IA Values Destination: Desktop
    • Output dynamic Values: checked
    • Output Array Fields: Value Per Array Field
  3. Finally, I created a Standard Export profile that output the captured fields as a text file, not a CSV file. I named the file with a “CSV” extension so Excel could easily open it, but to create the required output format, the file had to be written as a text file.  Here is what the Text File export profile looks like:

[Screenshot: the Text File export profile]

The content of the Text file export profile is:

EMPLOYEE,EID,DATE,REG HRS,OT HRS,TOT HRS
---- Start repeat for each level 1 node ----
---- Start repeat for each row of table: Desktop:1.UimData.Hours ----
{S|Desktop:1.UimData.Name},{S|Desktop:1.UimData.EmployeeNbr},{S|Desktop:1.UimData.Date},{S|Desktop:1.UimData.RegHrs},{S|Desktop:1.UimData.OTHrs},{S|Desktop:1.UimData.TotHrs}
---- End repeat ----
---- End repeat ----

By using two nested loops I was able to access the non-tabular fields, Name and EmployeeNbr, as well as the tabular fields in the same output statement.  This looping feature of the Text File export profile saved having to write a CaptureFlow script to iterate through all the table variables and concatenate Strings for export.  A nice feature, but not well documented.

Good Times With VirtualBox Networking

July 24th, 2014 by David Miller

TL;DR version: if you run multiple VirtualBox VMs on the same desktop, set up 3 network interfaces on each such VM (one NAT, one host-only, one bridged).

Now for the long, more entertaining (hopefully!) version:

Recently I switched from VMware Workstation to Oracle VirtualBox for my personal virtualization needs.  I’m very happy overall. VirtualBox seems faster to me – when I minimize a VM, do lots of other work, then restore the VM, it is responsive right away, whereas VMware would page for a minute or two.  And each VirtualBox VM is in a separate host window, which I like more than VMware’s single tabbed window.

Still, I must say VMware’s networking was easier to deal with.  Here’s how I ended up with 3 IP addresses in each of my local VMs…

I have a CentOS VM running Alfresco and Oracle; a Fedora VM running Apache SOLR and IntelliJ IDEA; and a Windows 2012 Server VM running Active Directory.  I need connectivity to each of them from my host desktop (Windows 8.1), and they need connectivity to each other, and they need to be able to connect to Armedia’s corporate VMs.  Plus,  I’d rather not update my hosts file or IP settings every time  I move between the office and home!

1st VirtualBox network: a NAT Network (Network Address Translation), which allows each VM to talk to the other VMs, but not to any other machine, and does not allow connections from the host desktop.  This meets Goal #2 (connectivity to each other).  But Goals #1 and #3 are not met yet.

2nd VirtualBox network: a VirtualBox Host-Only network which allows connectivity from the host desktop.  Now Goals #1 (connectivity from the host) and #2 (connectivity to each other) are just fine.

Also, both the NAT and the host-only networks offer stable IP addresses; whether at home or at work, my VMs get the same address each time, so I don’t spend 10 minutes updating IP references every time I switch location.

Danger!  Here is where VirtualBox tricks you!  It seems like Goal #3 (access to corporate VMs) is met too!  With the NAT and internal IP addresses, I can see our internal websites and copy smaller files to and from the data center VMs.  But if I transfer a larger file, I get a Connection Reset error!  Twice in the last month, I’ve spent hours tracking down the “defect” in the corporate network settings.  (You’d think I’d remember the problem the second time around; but in my defense the error manifested in different ways).

Solution?  Add the 3rd VirtualBox network: a bridged network (i.e., bridged to your physical network adapter, so each VM gets an IP address from the corporate/home DHCP server, just like the host does).  Now the 3rd goal is really met!  I can transfer files all day long, no worries.
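
If you prefer the command line to the GUI, here is a rough VBoxManage sketch of that three-adapter setup. The VM name, NAT network name, host-only adapter, and bridged interface below are placeholders; substitute the ones reported by VBoxManage list vms / hostonlyifs / bridgedifs, and power the VM off before changing its NICs.

# Adapter 1: NAT Network (VM-to-VM connectivity)
VBoxManage modifyvm "centos-alfresco" --nic1 natnetwork --nat-network1 "LocalNat"

# Adapter 2: host-only network (connectivity from the host desktop)
VBoxManage modifyvm "centos-alfresco" --nic2 hostonly --hostonlyadapter2 "vboxnet0"

# Adapter 3: bridged network (address from the corporate/home DHCP server)
VBoxManage modifyvm "centos-alfresco" --nic3 bridged --bridgeadapter3 "eth0"

# Check which host interface each NIC is currently bound to
VBoxManage showvminfo "centos-alfresco" | grep NIC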

Something to watch out for: when you disconnect a wired ethernet cable, VirtualBox automatically changes the bridged network to bind to your wireless interface.  This is nice since your VMs automatically get new addresses.  BUT! When you plug in the ethernet again (which in my case deactivates the wireless), VirtualBox does NOT switch back to the wired interface!  That happened to me this morning.  Spent a few hours trying to figure out why my file uploads failed.  Finally saw where VirtualBox re-bound my bridged network.  Changed it back to the wired interface, and all was well.
