Abstract:
Augmented Reality (AR) is a technology that combines real and virtual objects. Interaction is essential in AR, as it allows the user to manipulate virtual objects within the natural environment. Various types of gesture interaction can be applied in AR, and the use of gesture interaction in a drawing space has become a popular topic among researchers, who apply this concept to AR technology by using bare hands to interact directly with AR content. However, the visual and spatial complexity of a real-time drawing canvas for AR remains an unsolved issue, and using real hand gestures on a handheld device is still challenging. This paper details the implementation approach for an AR drawing application, together with hand gesture recognition for collecting gesture input for the AR drawing space. Because the gesture input must be conveyed to the handheld device, the integration is performed over a local network. The application flow is documented, and the user interface (UI) consists of three screens. The AR drawing space includes a brushstroke tool, a thickness slider, and colour selection. ARCore is used to enable tracking; it is a motion-based tracking approach that does not require a physical marker, also known as markerless tracking. By drawing attention to the unsolved problems that affect gesture input, namely the difficulty of converting traditional drawing into a digital drawing space and of using hand gestures as input, this paper explains the process of realizing real-time drawing in handheld AR using the user's real hand.