Point clouds are dense collections of millions of points that can advance content creation and interaction in emerging applications such as Augmented Reality (AR). However, each point carries real-world spatial and color information, making point clouds too computationally intensive to process under real-time constraints, especially on mobile devices. To stream dense point clouds (PtCl) to mobile devices, existing solutions encode pre-captured point clouds, treating PtCl capturing as a separate offline operation. To gain deeper insight, we combine PtCl capturing and streaming into a single pipeline and build a research prototype, consisting of a high-precision, high-resolution depth sensor, an edge-computing development board, and a smartphone, to study the bottlenecks of running this pipeline in real time on mobile devices. In a custom Unity app, we measure the latency of each operation from capture to rendering, as well as the energy efficiency of the board and the smartphone at different point cloud resolutions. The results reveal that a toolset helping users efficiently capture, stream, and process color and depth data is the key enabler of real-time PtCl capturing and streaming on mobile devices.