Android memory profiler

Diya Vijay
10 min read · Nov 2, 2023


The Android memory profiler is used to improve the performance of a mobile application by identifying areas that make inefficient use of resources such as the CPU, memory, graphics, network, or the device battery.

Android Studio offers several profiling tools to find and visualize potential problems:

  • CPU Profiler → track down runtime performance issues.
  • Memory Profiler → track memory allocations.
  • Network Profiler → record the sequence in which network requests are sent and received.
  • Energy Profiler → visualize energy usage to help diagnose battery drain.
Android Memory Profiler

Introduction

The Memory Profiler is a tool in Android Studio that provides a graphical interface for monitoring and analyzing the memory usage of your Android app. It shows how much memory the app allocates, uses, and releases, which helps in identifying memory leaks, excessive memory consumption, and other memory-related issues.

The Memory Profiler consists of various components, including the Memory Timeline and Heap Dump, which provide valuable insights into memory allocation, deallocation, and potential memory leaks.

Memory leaks and excessive memory usage can lead to app crashes, sluggishness, and a poor user experience. With the help of the Memory Profiler, we can identify and resolve memory-related problems, optimize memory usage, and ensure the app runs smoothly on different devices.

Getting Started

Ensure you have Android Studio installed and set up your project.

  • Go to the menu bar.
  • View → Tool Windows → Profiler

OR

  • Click Profile in the toolbar.
  • Select the required device and app process from the Android Profiler toolbar. (If connecting a device over USB, make sure USB debugging is enabled.)
  • Click anywhere in the memory timeline to open the memory profiler.

Common Terminology

  • Heap: A region of memory where memory is dynamically allocated during runtime.
  • Java Heap: The memory space used to store Java objects. It is managed by the runtime (ART on Android, the JVM elsewhere) and is divided into regions such as the young generation and the old generation.
  • Memory Timeline: A visual representation of memory usage over time. It shows how memory allocation and deallocation change during the execution of an app.
  • Memory Allocation: The process of assigning memory to objects during runtime. It reserves memory space for objects to store their data and reference other objects.
  • Native Heap: The memory space used for storing native objects and data structures, generally allocated by native code or libraries — for example, memory used by C/C++ code or resources like bitmaps.
  • Memory Deallocation: The process of releasing memory that is no longer needed by objects.
  • Allocation Tracker: Allows you to track object allocations during app execution. It provides information about object types, allocation call stacks, and the memory consumed by specific objects.
  • Retained Objects: Objects that are still referenced and held in memory even though they are no longer needed by the app. Examining retained objects helps in detecting memory leaks.
  • Dominators: An object dominates another object when every path from the garbage-collection roots to that object passes through it; in practice, dominators are objects that keep many other objects alive. They point to potential memory issues, such as memory leaks or excessive memory usage.

For example, imagine we have a shopping app where we create instances of a Product class. Each Product instance contains information such as the product name, price, and description. Now, let's say we have a ShoppingCart object that holds references to multiple Product objects.

Here the ShoppingCart object is considered a dominator because those Product objects are only reachable through it. The cart keeps them alive and therefore determines their memory usage.
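A minimal sketch of the hypothetical Product and ShoppingCart classes from this example (the names and fields are illustrative, not from any real codebase):

```kotlin
// Illustrative classes for the dominator example (hypothetical names and fields).
data class Product(
    val name: String,
    val price: Double,
    val description: String
)

class ShoppingCart {
    // While the cart is reachable, every Product it holds stays reachable too,
    // so the cart dominates those Product instances in the heap.
    private val items = mutableListOf<Product>()

    fun add(product: Product) {
        items += product
    }

    fun clear() {
        // Dropping the references lets the garbage collector reclaim the products.
        items.clear()
    }
}
```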

  • Reference Chains: The links between objects that refer to each other, forming a chain of references. Reference chains help in understanding object relationships and identifying potential memory leaks.
  • Memory Leaks: Memory leaks occur when objects are unintentionally retained in memory and not properly released when they are no longer needed (see the sketch below).
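A classic Android memory leak, sketched below with a simple Activity; LeakyHolder is a made-up name, but any long-lived static reference behaves the same way:

```kotlin
import android.app.Activity
import android.os.Bundle

// Hypothetical holder; any long-lived static field works the same way.
object LeakyHolder {
    var lastActivity: Activity? = null
}

class MainActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // LEAK: an application-lifetime object now references this Activity.
        // After rotation or back navigation, the old Activity instance (and its
        // whole view hierarchy) stays reachable and can never be collected.
        LeakyHolder.lastActivity = this
    }

    override fun onDestroy() {
        super.onDestroy()
        // Fix: clear the reference when the Activity goes away.
        LeakyHolder.lastActivity = null
    }
}
```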

Profiling an app with the memory profiler

To profile an app with the Memory Profiler, we first need to start a profiling session.

  • Select the app and choose the target device or emulator.
  • Once the profiling session is active, we can capture memory snapshots at different stages of app execution to analyze memory usage over time.
  • Capturing memory snapshots: We can manually capture memory snapshots at specific points during app execution by clicking the “Dump Java Heap” button (a programmatic alternative is sketched after this list).
  • Analyzing Memory Usage: The captured memory snapshots are displayed in the Memory Profiler interface. We can analyze memory usage by examining metrics like heap size, allocation count, and the contents of the heap.
  • The memory timeline provides a visual representation of memory changes over time, allowing us to identify memory spikes, excessive allocations, and memory leaks.
  • Inspecting Objects and References: We can inspect individual objects and their references by selecting a specific memory snapshot and exploring the objects present in the heap at that moment. We can view object details, analyze their reference chains, and understand their impact on memory consumption.
  • Tracking Memory Allocations: We can enable allocation tracking to monitor object creation and destruction during app execution. This helps in identifying areas of code that generate excessive object allocations, allowing us to optimize memory usage by minimizing unnecessary object creation.
  • Identifying Memory Leaks: The Memory Profiler helps in identifying memory leaks, which occur when objects are unintentionally retained in memory and cause memory consumption to grow over time.
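Heap dumps are usually captured from the profiler UI as described above, but here is a sketch of capturing one from code with android.os.Debug.dumpHprofData(), which can be handy when reproducing an issue away from Android Studio; the file name and location are only examples:

```kotlin
import android.content.Context
import android.os.Debug
import java.io.File

// Capture an HPROF heap dump programmatically (path is illustrative).
fun dumpHeap(context: Context) {
    val dumpFile = File(context.filesDir, "heap-${System.currentTimeMillis()}.hprof")
    // Writes a snapshot of the Java heap to the given file.
    Debug.dumpHprofData(dumpFile.absolutePath)
    // The file can then be pulled with adb and opened in Android Studio
    // (some external tools may require conversion with `hprof-conv`).
}
```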

Understanding Memory Types and Optimization

The different memory types and how to optimize them are as follows:

Java Heap: The Java heap is the memory region where Java objects are allocated. It is managed by the runtime (ART on Android, the JVM elsewhere) and is commonly divided into generations:

  • Young Generation → where new objects are created.
  • Old Generation → holds objects that have survived multiple garbage-collection cycles.

To optimize Java heap memory, focus on reducing unnecessary object creation, minimizing object retention, and optimizing garbage collection. Object pooling, using lightweight data structures, and reusing objects can help reduce memory allocations and improve performance.
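A minimal object-pool sketch of the reuse idea described above; the Particle class and pool size are hypothetical, and androidx.core.util.Pools offers ready-made equivalents:

```kotlin
// Minimal object pool: reuse instances instead of allocating new ones each frame.
class Particle {
    var x = 0f
    var y = 0f

    fun reset() {
        x = 0f
        y = 0f
    }
}

class ParticlePool(maxSize: Int = 64) {
    private val pool = ArrayDeque<Particle>(maxSize)

    // Reuse a pooled instance if one is available, otherwise allocate.
    fun obtain(): Particle = pool.removeLastOrNull() ?: Particle()

    fun recycle(particle: Particle) {
        particle.reset()       // clear state before reuse
        pool.addLast(particle) // return it to the pool instead of letting it become garbage
    }
}
```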

Native Memory: Memory allocated outside the Java heap is native memory. It is used for storing native code, libraries, and data structures, and is typically allocated by code written in languages like C or C++ with the help of the Android NDK (Native Development Kit).

Efficient native memory management involves properly allocating and deallocating memory resources in native code. Tracking native memory usage, implementing reference counting, and releasing resources appropriately can help optimize native memory.

Bitmaps: A specific use of (largely native) memory for storing and manipulating images in Android. Optimizing bitmap memory usage involves techniques such as downsampling, caching, and recycling bitmaps when they are no longer needed.
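A sketch of the downsampling technique mentioned above, using BitmapFactory.Options to decode a resource image near the requested size rather than at full resolution:

```kotlin
import android.content.res.Resources
import android.graphics.Bitmap
import android.graphics.BitmapFactory

// Decode a resource image at roughly the requested size instead of full resolution.
fun decodeSampledBitmap(res: Resources, resId: Int, reqWidth: Int, reqHeight: Int): Bitmap? {
    // First pass: read only the image dimensions, without allocating pixel memory.
    val bounds = BitmapFactory.Options().apply { inJustDecodeBounds = true }
    BitmapFactory.decodeResource(res, resId, bounds)

    // Pick a power-of-two sample size that keeps the decoded bitmap >= the requested size.
    var sampleSize = 1
    while (bounds.outWidth / (sampleSize * 2) >= reqWidth &&
        bounds.outHeight / (sampleSize * 2) >= reqHeight
    ) {
        sampleSize *= 2
    }

    // Second pass: decode the downsampled pixels.
    val options = BitmapFactory.Options().apply { inSampleSize = sampleSize }
    return BitmapFactory.decodeResource(res, resId, options)
}
```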

Bitmaps in Android are managed by the application itself, but their memory is allocated and controlled by the OS and the underlying graphics subsystem.

Pros -

  • Bitmaps can store images of different formats and support various image-processing operations, such as scaling, cropping, and applying filters.
  • They can be easily rendered on screen and displayed in various UI components, such as ImageViews.
  • Android provides mechanisms for bitmap caching, where bitmaps can be cached in memory or on disk to improve performance and reduce the need for repeated image decoding (see the cache sketch after the cons list below).

Cons -

  • Memory Consumption — large bitmaps can occupy a significant share of the app's memory.
  • Limited Heap Size — each app has a per-process heap limit, so large or numerous bitmaps can trigger OutOfMemoryError.
  • Memory Management Challenges — bitmaps require careful memory management in Android apps.
  • Performance Impact — decoding and drawing large bitmaps is expensive.
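A sketch of the in-memory bitmap caching mentioned in the pros above, using android.util.LruCache sized to a fraction of the app's maximum heap:

```kotlin
import android.graphics.Bitmap
import android.util.LruCache

// In-memory bitmap cache sized to roughly 1/8 of the app's max heap.
class BitmapMemoryCache {
    private val maxKb = (Runtime.getRuntime().maxMemory() / 1024).toInt()

    private val cache = object : LruCache<String, Bitmap>(maxKb / 8) {
        // Measure entries in kilobytes so entry sizes match the cache's max size.
        override fun sizeOf(key: String, value: Bitmap): Int = value.byteCount / 1024
    }

    fun put(key: String, bitmap: Bitmap) {
        if (cache.get(key) == null) cache.put(key, bitmap)
    }

    fun get(key: String): Bitmap? = cache.get(key)
}
```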

Advanced Memory Analysis

Advanced memory analysis techniques can provide deeper insights into memory usage and help identify and resolve complex memory-related issues in Android applications.

  • Heap Dump Analysis: A heap dump is a snapshot of the entire memory heap at a specific point in time. It allows you to examine memory usage in detail, including object instances, their sizes, and their references.

Heap Dump snapshot table terms:


Allocations refers to the number of instances of a class that have been allocated on the heap.

Native Size refers to the amount of native memory occupied by instances of the class, in bytes. It provides insight into memory usage that is not managed directly on the Java heap.

Shallow Size represents the memory consumed by an object itself, in bytes, excluding the objects it references.

Retained Size includes the memory occupied by an object plus all objects reachable only through it (that is, the memory that would be freed if the object were collected), measured in bytes.

NOTE: Heap analysis tools may provide options to display these sizes in different units, such as KB, MB, or GB, depending on the scale of the memory usage being analyzed.
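A small illustration of shallow vs. retained size (the byte counts are approximate and vary by device and runtime):

```kotlin
// Shallow vs. retained size, illustrated with a hypothetical class.
class ImageHolder {
    // The ImageHolder object itself is tiny: its shallow size is just the object
    // header plus one reference field (a few dozen bytes at most).
    val pixels = ByteArray(1_000_000)
    // The 1 MB array is reachable only through this holder, so the holder's
    // retained size is roughly 1 MB: collecting the holder would free the array too.
}
```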

  • Reference Analysis: Objects are linked together like a chain, and sometimes objects in the chain remain connected when they should not be, which causes memory leaks. Analyzing object references can reveal how objects are related and identify potential leaks.
  • Allocation Tracking: Allocation tracking lets you monitor object allocations and deallocations during app execution. It shows where and how objects are created, which helps in identifying areas of code that generate excessive object allocations.
  • Garbage Collection Analysis: Garbage collection (GC) cleans up unused objects in the app's memory. Analyzing GC behavior can provide insights into memory-management efficiency. With the help of GC logs or profiling tools, we can understand the frequency, duration, and impact of garbage-collection pauses on the app's performance.

The Android Runtime (ART), and previously the Dalvik virtual machine, provides a managed memory environment that keeps track of each memory allocation. The runtime determines when an object is no longer used and reclaims its memory through garbage collection.

Cons:

  • Performance Impact: The GC process itself consumes computational resources and can temporarily pause the app while collections run.
  • Lack of Control: There is little direct control over when memory is allocated and deallocated.
  • Tuning Complexity: Optimizing GC behavior and tuning its parameters to best fit an application's specific memory characteristics and workload can be a complex task.

  • Memory Profiling APIs: Android provides APIs for gathering information about memory usage during app runtime. These APIs let us collect custom memory metrics, track memory allocations, and implement our own memory-analysis logic (a sketch follows this list).
  • Performance Testing: Conducting performance testing with memory analysis helps identify memory-related bottlenecks and performance issues.
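A sketch of the kind of custom metric collection these APIs allow, using android.os.Debug and Runtime; the log tag is arbitrary and the GC stat name assumes ART on API 23+:

```kotlin
import android.os.Debug
import android.util.Log

// Log a few runtime memory metrics; useful for tracking memory in automated tests.
fun logMemoryMetrics(tag: String = "MemMetrics") {
    val runtime = Runtime.getRuntime()
    val usedJavaHeapKb = (runtime.totalMemory() - runtime.freeMemory()) / 1024
    val maxJavaHeapKb = runtime.maxMemory() / 1024
    val nativeHeapKb = Debug.getNativeHeapAllocatedSize() / 1024

    Log.d(tag, "Java heap: $usedJavaHeapKb KB used of $maxJavaHeapKb KB max")
    Log.d(tag, "Native heap allocated: $nativeHeapKb KB")
    // ART also exposes GC counters via Debug.getRuntimeStat, e.g. "art.gc.gc-count".
    Log.d(tag, "GC count: ${Debug.getRuntimeStat("art.gc.gc-count")}")
}
```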

FAQs

Q1) How to automate memory profiling in Android Studio?

A1) We can use command-line tools via the Android Debug Bridge (ADB), such as `adb shell dumpsys meminfo <package-name>`, to collect memory-related information from the app and script it as part of automated tests.

Q2) How does the CPU work for Android apps?

A2) The CPU is responsible for executing the instructions and performing the calculations necessary to run Android apps. Here's how the CPU works for Android apps:

Instruction Fetch: The CPU fetches instructions from the device's memory. These instructions are stored as machine code, a sequence of binary instructions that the CPU understands.

Instruction Decoding: The CPU decodes the fetched instructions, determining the operation to be performed and the operands involved. It breaks down the instructions into smaller steps that can be executed by different CPU components.

Data Fetch and Execution: The CPU fetches the data required for executing the instruction from the memory or caches. It performs arithmetic or logical operations on the data based on the instruction’s requirements.

Control Flow: The CPU manages the control flow of instructions, determining the order in which instructions are executed. It updates the program counter, which keeps track of the memory address of the next instruction to be fetched.

Caching: The CPU utilizes different levels of cache memory to store frequently accessed instructions and data. Caches are faster than main memory, enabling the CPU to reduce the time required to fetch data from slower RAM. Caches help improve performance by reducing memory latency.

Multithreading: Many modern CPUs support multithreading, allowing them to execute multiple threads simultaneously. In Android, this can be beneficial for running multiple apps or processes concurrently.

Power Management: The CPU also plays a major role in power consumption on Android devices. It dynamically adjusts its frequency and voltage based on the workload to balance performance and energy efficiency.

Q3) What is the difference between heap and RAM?

A3) Here’s the difference between heap and RAM.

RAM (Random Access Memory)

RAM is a physical hardware component in a computer system that provides temporary storage for data and instructions that are actively being used by the CPU. It is a volatile memory, meaning its contents are lost when the power is turned off.

RAM serves as the primary memory for the computer, and it holds the operating system, currently running programs, and data needed for their execution. When you run an Android app, it gets loaded into RAM along with the necessary system resources to execute the app.

RAM allows for quick and random access to data, facilitating faster read and write operations compared to other storage devices like hard drives. However, RAM is limited in capacity and more expensive than secondary storage options.

Heap

The heap, on the other hand, is a specific region of a computer’s memory where dynamically allocated memory is managed. It is a part of the RAM that is dedicated to storing dynamically allocated objects and data structures during program execution.

In the context of programming, the heap is primarily used for managing objects created at runtime. When an Android app runs, it may allocate memory from the heap to create objects, such as variables, arrays, or complex data structures, as needed.

The heap is structured in a way that allows for flexible memory allocation and deallocation. Memory is allocated from the heap as requested and released when no longer needed, through mechanisms like garbage collection.

Q4) What is OOM?

A4) OOM stands for out of memory. It refers to a situation in which a computer system or program has exhausted all available memory resources and is unable to allocate additional memory for executing tasks or storing data. When an OOM occurs, the system or program may become unstable or crash.

In Android apps, an OOM error occurs when an app consumes excessive amounts of memory, causing the system to run out of available memory resources.

An OOM can lead to the app being terminated by the system. The Android operating system includes a mechanism often called the “Out of Memory Killer” (OOM Killer) that monitors memory usage and terminates apps that consume excessive memory, freeing up resources for other processes and system functions. One way to reduce the risk is to respond to the system's memory-pressure callbacks (see the sketch after the list below).

OOMs are divided into two categories depending on the app's state when they occur.

  • Background OOMs (BOOMs) — the app is terminated while in the background; this is usually driven more by overall memory pressure (for example, the foreground app consuming memory) than by excessive memory use by the backgrounded app itself.
  • Foreground OOMs (FOOMs) — the app is terminated while in the foreground because it is using too much memory.
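One way to lower the risk of being killed for memory is to release recreatable data when the system signals memory pressure, via ComponentCallbacks2.onTrimMemory(); a minimal sketch, assuming a hypothetical bitmapCache:

```kotlin
import android.app.Application
import android.content.ComponentCallbacks2

class MyApp : Application() {
    // Hypothetical cache; substitute whatever large, recreatable data the app holds.
    private val bitmapCache = mutableMapOf<String, Any>()

    override fun onTrimMemory(level: Int) {
        super.onTrimMemory(level)
        // Release caches once memory is running low or the app's UI is no longer visible,
        // so the system is less likely to kill the process outright.
        if (level >= ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW) {
            bitmapCache.clear()
        }
    }
}
```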


Written by Diya Vijay

SWE intern @ Microsoft | Gold MLSA | Top 50 @ Samsung Solve For Tomorrow | Gold Microsoft Learn Student Ambassador
