Recently, while testing an Android application our team was developing, we encountered performance issues in two of our ListViews. Fortunately, the Android SDK provides mature debugging tools to help investigate performance problems. In this post, I'll describe how we used Android's toolset to locate the source of our problems and to verify that our optimizations actually worked. I'll also share a few tips to keep in mind while tracing your own Android applications.

Some Background

Capture of Scores View

Our team recently had the privilege of partnering with Turner Sports to deliver the Android NBA Game Time application for NBA Digital. To date, the application has received a number of positive reviews, including distinction as Gizmodo's Android App of the Week in the first week of April.

Our road to success wasn't without some effort, though. While testing the application, we noticed poor scrolling performance on the Scores listing screen and the Scheduled games screen, both of which are custom ListView Activities. If you're not familiar with the application, a screenshot of the Scores view is included here. Because we receive data from live feeds and the views are more complicated than traditional Android ListViews, we needed to write our own ListView Adapter. Unfortunately, this implementation didn't scroll as smoothly as we wanted, and it consumed more memory than we expected. Our immediate thought was that we had a memory leak somewhere, but we needed tools to confirm that suspicion.

How We Found the Problem

We used Android's tracing API to investigate where the problems were. If you're familiar with Java profiling tools such as JProbe, Android's TraceView provides similar basic method tracing functionality, albeit with more work required of the developer. I followed the guidance in the TraceView documentation to configure our application for tracing. Here are the steps I took.

In our Scores activity, I added start and stop tracing calls to trigger tracing. Below is a code snippet showing how I accomplished this in our ListActivity. I started tracing before we did anything in our activity and stopped it only after all of our shutdown code had finished, to ensure we'd also trace everything our superclasses were doing.

	public class ScoresActivity extends ListActivity {

		@Override
		public void onStart() {
			// Start tracing to "/sdcard/scores.trace" before anything
			// else runs so superclass work is captured too
			Debug.startMethodTracing("scores");
			super.onStart();
			// Other startup code here...
		}

		@Override
		public void onStop() {
			super.onStop();
			// Other shutdown code here...
			Debug.stopMethodTracing();
		}

		// Remainder of activity implementation
	}
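Note that Debug.startMethodTracing() writes its trace file to the SD card, so your manifest also needs permission to write external storage. A minimal fragment:

```xml
<!-- Required so the trace file can be written to /sdcard -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```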
 

After adding the tracing statements to the target activity, I rebuilt the application and deployed it on a developer phone. If you don't have a developer device available, you can also use the built-in emulator, but the TraceView documentation details a few additional steps you'll need to follow. Once the application was installed, I performed the actions that were producing the undesirable results. When I'd finished running the activity, a trace file was available for me to download to my development machine. I pulled the trace file and loaded it in TraceView using the following commands:

	localhost$ adb pull /sdcard/scores.trace scores.before.trace
 	localhost$ traceview scores.before.trace	
 

Below is a screenshot of TraceView showing the profiling results before applying the fixes. Looking at the output, you can see our application was spending almost 50% of its entire running time in the method responsible for drawing each game tile in the Scores view. These results showed that we could improve upon our original design, which was based on the Efficient List View Adapter example in the SDK documentation.

Scores View Profiling Results before Fixes
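For reference, the Efficient List View Adapter pattern that our original design followed recycles the convertView the ListView hands back and caches child-view lookups in a holder object. A rough sketch of that pattern (the layout, resource IDs, and Game class here are illustrative, not our actual code):

```java
// Sketch of the SDK's "Efficient List View Adapter" pattern;
// R.layout.game_tile, the view IDs, and Game are hypothetical names.
private static class ViewHolder {
    TextView homeTeam;
    TextView awayTeam;
}

@Override
public View getView(int position, View convertView, ViewGroup parent) {
    ViewHolder holder;
    if (convertView == null) {
        // Inflate a new tile only when the ListView has no
        // recycled view to hand back to us
        convertView = mInflater.inflate(R.layout.game_tile, parent, false);
        holder = new ViewHolder();
        holder.homeTeam = (TextView) convertView.findViewById(R.id.home_team);
        holder.awayTeam = (TextView) convertView.findViewById(R.id.away_team);
        convertView.setTag(holder);
    } else {
        // Reuse the recycled view and its cached child references,
        // avoiding repeated inflation and findViewById() calls
        holder = (ViewHolder) convertView.getTag();
    }
    Game game = getItem(position);
    holder.homeTeam.setText(game.getHomeTeamName());
    holder.awayTeam.setText(game.getAwayTeamName());
    return convertView;
}
```

Even with this pattern as a starting point, the work done per tile still dominated our profile, which is what the enhancements described below addressed.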

Eventually we settled on some enhancements that we believed would improve the performance and memory usage of the application. Of course, we needed to confirm that our fixes addressed the problems we were seeing and, more importantly, didn't make performance worse. Manual testing, along with the heap analysis tools included in the SDK, indicated that performance in the Scores view was now noticeably better. We still needed further proof that the fixes worked, so I again enabled tracing for the application and performed the same tests I'd run before the fixes were applied. Below is a screenshot of our results in TraceView.

Scores View Profiling Results after Fixes

TraceView confirms that we improved relative timing considerably. Not only do 19 other Android methods now take more time than our most expensive method (which ranked 7th before the fixes), but we also decreased the relative time spent in our ListViewAdapter#getView() to 10%! Further testing confirmed our fixes were solid, so they became part of the release version of the application. In a future post, I'll share our solution, as I suspect other Android developers will run into the same types of problems we did when using complicated list view items.

Performance Tuning Tips

If you’re not familiar with method tracing tools, the information available can be daunting at first. Here are a few tips you can use to ensure you’re able to spot issues quickly using the profiling tools.

Remember, Timing is not Absolute

When tracing, the application runs much more slowly than it would under normal operation, so you cannot rely on the exact timings reported. You can, however, use the percentage of time spent in a method to track gains and losses in performance relative to the rest of your application. Percentages give you a fairly reliable metric to compare between runs.
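As a rough illustration, the relative change between two runs can be computed from the percentages alone. This tiny helper is mine, not part of the Android SDK:

```java
// Hypothetical helper for comparing a method's share of run time
// across two TraceView runs; not part of the Android SDK.
public class TraceCompare {

    // Relative change in a method's share of total run time:
    // 50% before and 10% after yields -0.8, i.e. an 80% relative drop.
    static double relativeChange(double beforePercent, double afterPercent) {
        return (afterPercent - beforePercent) / beforePercent;
    }

    public static void main(String[] args) {
        // Our getView() went from roughly 50% of run time to 10%
        System.out.println(relativeChange(50.0, 10.0)); // prints -0.8
    }
}
```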

Be Wary of Expensive Methods

If you notice a large percentage of time being spent in one or more of your methods, it’s likely there is a bottleneck you can improve. In Android’s TraceView, these methods will often appear towards the top of the list. There are no hard rules on what percentage is too high, but it’s worth investigating methods that monopolize more than 15-20% of total running time.

Don’t Optimize Unless You Need It

I’ve had tremendous success in all types of applications by first keeping the design simple and optimizing later (only when needed). Unless there are painfully obvious performance issues during development, I’ve found it better to defer tuning until after the majority of application functionality is written. This approach works well for the majority of applications because it ensures that most of the time is spent on developing features the user directly notices.
