Thursday, July 28, 2011

Changing color scheme and font settings in gvim




I like using vim for quickly loading a file to view or edit on my Windows machine.
At work, with the monitors being 21" in size, I wanted gvim to open up in a larger frame, and I also wanted a few tweaks incorporated (font, color scheme, tab width, etc.). I thought I'd simply have to edit the .vimrc in my home directory and voila, everything should work. Well, not quite. It works fine for my command-line vi invocation from cygwin, but not when I open a file in gvim. After googling a bit, I found a few suggestions that satisfied my needs. All you have to do is edit the _vimrc file located in your vim installation folder, for instance C:\Program Files\Vim.
You can either edit it directly, or open up gvim and go to Edit -> Startup Settings. Once opened, you can add your settings to the file. In my case, I added the following:

set tabstop=4
set number
set lines=50
set columns=120
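
If you're not sure which vimrc file gvim actually loaded, you can check from within gvim itself: $MYVIMRC holds the path of the vimrc that was read, and :version lists the locations it searches.

:echo $MYVIMRC
:version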

If you need to change the color scheme, choose the color scheme that you like and add the following to the _vimrc file:

:color morning

For font settings, choose the font that you like, then type
:set guifont?
This will display the current font setting. Note it down and set guifont to that value in _vimrc.
Here's mine:

if has('gui_running')
 set guifont=Bitstream_Vera_Sans_Mono:h10:cANSI
endif

Happy vimming!! And thanks to the folks who put out these nice little tidbits about vim settings.

Wednesday, July 6, 2011

JSON Hiccups

This is my first foray into the JSON world, and everything looked pretty straightforward from the outset.
When the time came to dig in, use JSON and map it back to POJOs, issues started sprouting, but nothing too daunting or complicated.

Issue 1: I had to call a RESTful web service which responds with the application/json type. I don't have a JSON schema available and the JSON object is quite large (requiring quite a few classes and fields). I needed to generate the Java classes, but couldn't figure out an easy way (like how you would generate classes from an XML schema). I could look at the JSON string and create the Java classes manually, which is what I did for the simpler JSON responses, but for the larger responses that didn't seem like a smart way to go about it. After some googling, I found this wonderful link http://jsongen.byingtondesign.com/ which generates the Java files for you if you provide the URL of the JSON response. What a time-saver!
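
Just to illustrate the kind of mapping involved (the field names here are made up), a response fragment like {"customer": {"firstName": "Arnold", "accountId": 1234}} ends up as a plain POJO along these lines, one class per JSON object:

public class Customer {

    private String firstName;
    private int accountId;

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public int getAccountId() { return accountId; }
    public void setAccountId(int accountId) { this.accountId = accountId; }
}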

Issue 2: This time, I was getting a JSON string which didn't follow the usual naming conventions. All the property names were capitalized, and some properties were just named '$'. While reading about JSON mapping, I learned that BadgerFish, when converting XML to JSON, generates '$' as the property name to represent the text inside an XML element. For example:
XML: <customer>Arnold</customer> is represented as
JSON: {"$": "Arnold"}
But due to the capitalized property names, when I tried to use ObjectMapper's readValue(json, Class), it failed because it was looking for a property with a lowercase name. To overcome that, I had to add the @JsonProperty annotation to each property in the Java class to match the property name in the JSON, and everything went smoothly after that.
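
Roughly what that looks like (Customer, Name and the field names are just stand-ins for my actual payload; the annotation comes from Jackson 1.x):

import org.codehaus.jackson.annotate.JsonProperty;

public class Customer {

    @JsonProperty("Name")   // JSON property is capitalized, the Java field is not
    private String name;

    @JsonProperty("$")      // BadgerFish-style text content
    private String value;

    public String getName() { return name; }
    public String getValue() { return value; }
}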

Issue 3: Some of the generated Java files didn't include all the property names that were found in the JSON string. So when I tried to map the values from JSON into Java objects, it failed with an org.codehaus.jackson.map.exc.UnrecognizedPropertyException. If you need a missing property, you have to add it to your generated Java file. But assuming you don't need it and you don't want to keep adding fields your app never uses, the simplest way is to disable the FAIL_ON_UNKNOWN_PROPERTIES feature in Jackson's deserialization config by calling:

objectMapper.configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);
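
In context it looks something like this (JsonClient and Customer are made-up names for my actual classes):

import java.io.IOException;

import org.codehaus.jackson.map.DeserializationConfig;
import org.codehaus.jackson.map.ObjectMapper;

public class JsonClient {

    public Customer parse(String json) throws IOException {
        ObjectMapper objectMapper = new ObjectMapper();
        // silently skip JSON properties that have no matching field in the POJO
        objectMapper.configure(DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        return objectMapper.readValue(json, Customer.class);
    }
}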

Monday, April 18, 2011

Maven Token Replacement

I've been trying to create a Maven POM file that does the same stuff the current Ant build file does. In the process, I've learned some good things about Maven.
One thing I had to accommodate is our directory structure, which unfortunately is not the same as Maven's default. Once I got the basics working, I wanted to try a couple of things:

* Add profiles, so I can build the app for different environments
* Add tokens to files that can be replaced during the build process

To setup the profiles, I just had to add something like this:
<profiles>
  <profile>
    <id>dev</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <env.properties>dev.properties</env.properties>
    </properties>
  </profile>

  <profile>
    <id>sso</id>
    <properties>
      <env.properties>sso.properties</env.properties>
    </properties>
  </profile>
</profiles>

This can be part of the pom itself, or it can be saved into a file called profiles.xml.
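
To build against a particular profile, pass its id on the command line; with the profiles above, something like:

> mvn package -P sso

The dev profile kicks in by default (courtesy of activeByDefault) when no profile is specified.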

To get the token replacement working, I ran into some trouble. First I tried using filters, which are specifically meant for this purpose, or so I thought.

Add the filter:
<filters>
  <filter>dev.properties</filter>
</filters>


Enable filtering on the resource (without this, filtering won't work):
<resource>
  <filtering>true</filtering>
  <directory>src/main/webapp/WEB-INF</directory>
  <includes><include>web.xml</include></includes>
</resource>
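
To tie that together, resource filtering replaces ${...} style tokens with values from the filter file; auth.method and FORM below are just the property I happened to be replacing:

web.xml:        <auth-method>${auth.method}</auth-method>
dev.properties: auth.method=FORM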

This did replace the token specified in the web.xml file, but filtering actually just copies the included resources to WEB-INF/classes after replacing the token. So filters were out, and I had to look for another option. Then I ran into the maven-replacer-plugin, which does exactly what I needed.

<plugin>
  <groupId>com.google.code.maven-replacer-plugin</groupId>
  <artifactId>maven-replacer-plugin</artifactId>
  <version>1.3.5</version>
  <executions>
    <execution>
      <id>replaceAuth</id>
      <phase>package</phase>
      <goals>
        <goal>replace</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <file>target/simple-webapp/WEB-INF/web.xml</file>
    <replacements>
      <replacement>
        <token>AUTH_METHOD</token>
        <value>${auth.method}</value>
      </replacement>
    </replacements>
  </configuration>
</plugin>

I first tried the 'prepare-package' phase as mentioned in some forums, but the file I needed to run the replacement on hadn't been copied over to the target directory at that point, so the Maven build failed. I had to use the 'package' phase to apply the token replacement.

But when I ran
> mvn package
the final war file didn't have the replacement applied, even though the file under the exploded directory had the replaced value. The reason: the war:war goal was run before the replacer:replace goal.

Back to the forums, where I found that I also had to configure the maven-war-plugin to get this working.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.1.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>war</goal></goals>
      <configuration>
        <webXml>target/simple-webapp/WEB-INF/web.xml</webXml>
      </configuration>
    </execution>
  </executions>
</plugin>


The forums suggested that I use the 'exploded' goal, but that didn't do the job for me. I had to use the 'war' goal and also add the webXml element pointing to the modified file. This finally got me what I wanted: all the tokens replaced in the target directory and also in the final .war file.
Glad I got it working, but I'm still uncertain about the impact of including webXml. Theoretically, I should have had this working without specifying the configuration portion. What if I have to modify a file other than web.xml? I'll edit this after I try that and if I gain any further understanding.

Update: 
Turns out that I really didn't need the <webXml> tag defined within the war plugin. But I did notice that the war goal runs twice: once by default and once after the replacement. Another thing to keep in mind is that since both plugins (replacer & war) are bound to the same phase (package), the order matters. If war runs before replacer, the final war will not have the replaced values, so make sure the replacer plugin comes AFTER the war plugin in the POM.

To replace multiple tokens:
Use the following config in replacer plugin ...

<configuration>
  <includes>
    <include>target/${project.build.finalName}/sample1.txt</include>
    <include>target/${project.build.finalName}/sample2.txt</include>
  </includes>
  <replacements>
    <replacement>
      <token>token1</token>
      <value>value1</value>
    </replacement>
    <replacement>
      <token>token2</token>
      <value>value2</value>
    </replacement>
  </replacements>
</configuration>

Externalize the tokens in a properties file:
Use the properties-maven-plugin to load the properties file, as shown below:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>properties-maven-plugin</artifactId>
  <version>1.0-SNAPSHOT</version>
  <executions>
    <execution>
      <phase>initialize</phase>
      <goals>
        <goal>read-project-properties</goal>
      </goals>
      <configuration>
        <files>
          <file>etc/config/dev.properties</file>
        </files>
      </configuration>
    </execution>
  </executions>
</plugin>
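
For completeness, the properties file is just plain key=value pairs holding the values behind the replacement tokens; mine looked something like this (the value is a placeholder):

# etc/config/dev.properties
auth.method=FORM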

Tuesday, April 27, 2010

Solaris Tuning

A few of the things I learned while performing load tests on a Solaris box were quite interesting and informative. Though these might be archaic for some, it was refreshing to view the performance limitations of an application from a different angle, rather than the usual code-related issues (inefficient queries, haphazardly used synchronization blocks, memory hoggers, etc.).
I'll try to list some of the commands I used on the Solaris box to get additional information. The first one was pretty obvious, since when the load test was run, the WebLogic server started coughing up IOExceptions (too many open files).
Running the ulimit command showed that the open-files setting on the box was too low for a high load.
> ulimit -a
...
file size (blocks)      unlimited
open files                256
stack size (kbytes)    8192
...
This number should be adjusted based on the load (number of users) and the other processes running on the system.
> ulimit -n 1024
This applies only to the current session. If it needs to be set permanently, rlim_fd_max (the default hard limit) needs to be raised to that number and the system rebooted for the change to take effect.
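To make the change permanent, the limits go into /etc/system, something along these lines (pick numbers that suit your load):

set rlim_fd_max = 4096
set rlim_fd_cur = 1024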
To view the current list of file descriptors used by a specific process, use:
> ls /proc/<pid>/fd | wc -l

Another helpful command is prstat, or top (if available), which displays the processes that are using the most CPU on the server.
> prstat -a (displays the CPU-intensive processes grouped by user)
> prstat -n 3 -c (limits the output to the top 3 processes and prints each report below the previous one)

Next comes sar (System Activity Reporter). This command lets you view system activity, and the most interesting option for me was
> sar -u (which displays CPU utilization)

The columns reported are the time, %usr (user time), %sys (system time), %wio (waiting for I/O) and %idle (idle time).
If you want to watch the CPU utilization every minute for the next 10 minutes, use
> sar -u 60 10
If the idle time is consistently 0 or very low, the CPU is running short of headroom. And if the wait-for-I/O time is consistently high, it points to an I/O bottleneck, or threads stuck waiting on I/O.

vmstat is another command that provides virtual memory, disk, paging and CPU information.
It displays the run, blocked and swapped process queues; typically you should not see a high number in the 'blocked' column across consecutive readings.
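For example, to sample every 5 seconds until you interrupt it:

> vmstat 5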

iostat displays the input/output statistics for each disk. If r/s and w/s are consistently high along with %b (the percentage of time the disk is busy with transactions), then the application needs to be tuned to use I/O more effectively.
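
On Solaris I found the extended output easiest to read, again with a sampling interval:

> iostat -xn 5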

To summarize, the following are the commands that helped me gain additional insight into my app:
  • ulimit
  • prstat
  • sar
  • vmstat
  • iostat

Wednesday, April 21, 2010

Remote Monitoring Using jconsole

Recently, we made an architectural change to our app to store search results in the HttpSession for post-sorting options. To test the impact of this additional memory usage, I had to run a load test with a few hundred users who would perform random searches whose results would be saved to their sessions. Locally I was easily able to set up JMX monitoring by adding the following to the server start script:

@REM JConsole
set JAVA_OPTIONS=%JAVA_OPTIONS% -Dcom.sun.management.jmxremote -Xmanagement


But when I tried to do the same for the cluster (running on a unix server) where I was going to perform the load test, I ran into a minor hitch as the server was running over a secure layer (https). After a few tries, and reading a little bit more from the Sun site about jconsole setup, I got it working. Here's what I had to include in my managed server startup script:

# JMX Remote Monitoring Settings
JMX_PROPERTIES=" -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=${JMX_PORT} -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false"
export JMX_PROPERTIES


JMX_PORT is the port on each managed server where jconsole connects to the JMX agent.
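
One caveat, depending on how your startup script is put together: JMX_PROPERTIES still has to end up on the server's command line; in my case that meant appending it to the Java options, roughly like this:

JAVA_OPTIONS="${JAVA_OPTIONS} ${JMX_PROPERTIES}"
export JAVA_OPTIONS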

And on my laptop, I just had to fire off jconsole and enter the following under the "Advanced" tab:

service:jmx:rmi:///jndi/rmi://<hostname>:<managed_server_port>/jmxrmi

.. and you are presented with a beautiful sight (or a horrendous one) depending on how well your VM performs. :-)

Monday, December 15, 2008

Elapsed Time in ksh

Being a novice at shell programming (ksh & bash), I've been looking for something short and sweet that would tell me how much time my (Ant) build took to complete. Printing out the time at the beginning of the script and at the end is very simple, but what I wanted was to print out something like "application build took <n> seconds and was completed successfully at <time>".
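
One simple way to do this is with the shell's built-in SECONDS variable (both ksh and bash provide it). A minimal sketch, assuming the build is kicked off with a plain 'ant' call:

#!/bin/ksh
START=$SECONDS
ant && STATUS="completed successfully" || STATUS="FAILED"   # run the build and remember how it went
ELAPSED=$(( SECONDS - START ))
print "application build took ${ELAPSED} seconds and ${STATUS} at $(date)"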

Friday, July 11, 2008

Mylyn is awesome

When I was trying to install an update for an eclipse plugin, I noticed that there were a couple of Mylyn-related items present and I had no idea where they came from, but later realized that Mylyn is bundled along with Eclipse Europa and later versions. I didn't bother much with it at the time, but a couple of days later I ran into a developerWorks article about how Mylyn makes you even more productive. That piqued my curiosity and I dug into what it was about. I played around with it a little bit.. and boy, was I glad to have run into Mylyn! It REALLY DOES make you MUCH MORE productive and more focused.
And it's very simple to use as well.
All you need to do is create a new local task (if you don't have a repository set up) and give it a due date, a scheduled date (based on your schedule) and a priority. Activate the task and that's pretty much it! From there on, any file that you open to work on is automatically associated with the currently active Mylyn task, and it also makes it easy on you by removing files that were closed, or that were opened for a long time but never edited.

Another beauty is that it shows (in the package explorer) only the methods that were accessed, which means you don't get lost sorting through the various classes in your project, and the numerous methods in a class, that have no relevance to the task you're currently working on.

Finally, since Mylyn integrates with CVS, if you have Eclipse hooked up to CVS, you'll see that when it's time to check in the changes, all the classes impacted by the task you worked on are automatically grouped for you, making it much easier to check in, as a group, the files that were updated for a specific fix or a new change. Just awesome! No more hassle of trying to figure out which files were changed for which task, especially when you're working on multiple tasks in a huge project.

It also tracks the amount of time spent on each task after it's activated, again making it easy for you to provide better estimates of time spent on each task (for micro-managing bosses). I'm yet to play around with the bug tracking feature, but I already know that I'm gonna love that as well. Even if I don't, I'm gonna continue using Mylyn from now on for creating my personal tasks. Please give it a try and fall in love with it.
Here's the link that got me onto mylyn:
http://www.ibm.com/developerworks/java/library/j-mylyn1/

Adios.
Arun