6. DISCUSSION AND EVALUATION
As discussed earlier, there are various advantages and disadvantages
when comparing different digital scratching methods.
In general, however, it is difficult to objectively evaluate
new methods and compare them against past work. Just as a
violinist grows accustomed to the feel and sound of their
instrument, a DJ learns the subtleties of a given scratching
method and can be averse to change [5]. Nevertheless, informal
performance testing showed promising results, with
minimal perceived latency between input gesture data and
output audio playback.
General measures of evaluation included precision,
responsiveness, and stability. Rapid physical gestures
proved very responsive and produced precise corresponding
audio effects, and repeated physical gestures yielded a
consistent-sounding effect over long performance
times. Further testing with professional-level DJs,
however, is needed for a more complete evaluation. For
a video demonstration of the system in action, please see
http://ccrma.stanford.edu/~njb/research/turntable/.
The one-way network latency between a given phone
and the host computer was measured to average 3-5 ms,
which compares favorably with professional audio recording
equipment. Compared against the maximum humanly possible
scratch rate (10-20 turns per second [17]), the 100 Hz
sample rate of the accelerometer and gyroscope appears
sufficient. The perceived effect of accelerometer and gyroscope
latency, however, is difficult to measure and depends on
the sensor filtering method used, requiring further study
and user evaluation.
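As a rough sanity check (a back-of-the-envelope sketch using only the figures reported above, not part of the formal measurements), the sampling margin and a worst-case delay bound can be computed directly:

```python
# Illustrative arithmetic only: rates and latencies are taken from the text.
SENSOR_RATE_HZ = 100.0    # accelerometer/gyroscope sample rate
MAX_SCRATCH_RATE = 20.0   # fastest reported scratch rate, turns per second [17]
NETWORK_LATENCY_MS = 5.0  # upper end of the measured one-way network latency

# Number of sensor readings available per fastest-possible scratch turn.
samples_per_turn = SENSOR_RATE_HZ / MAX_SCRATCH_RATE
print(samples_per_turn)  # 5.0

# Worst-case staleness of a gesture sample when it reaches the host:
# one full sensor interval plus the one-way network latency.
worst_case_ms = 1000.0 / SENSOR_RATE_HZ + NETWORK_LATENCY_MS
print(worst_case_ms)  # 15.0
```

Even at the fastest reported scratch rate, each turn is covered by several sensor readings, consistent with the claim that 100 Hz sampling is sufficient; note this bound excludes any additional delay introduced by sensor filtering.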
7. CONCLUSIONS
A straightforward and surprisingly effective method of digi-
tal scratching is presented. The proposed method leverages
existing analog turntables as a physical interface and takes
advantage of the capabilities of modern sensor-equipped
smartphones, resulting in a genuinely physical, wireless
sensing-based scratching method. Benefits include digital audio and
storage, minimal additional hardware, familiar propriocep-
tive feedback, and a single interface to control both digital
and analog audio. Further benefits include visual display,
gesture modification, and the possibility of interactions un-
tethered from the turntable. Testing and evaluation show
this approach to be viable and promising.
8. ACKNOWLEDGMENTS
This work was enabled by National Science Foundation
Creative IT grant No. IIS-0855758 as well as funding from
the School of Humanities and Sciences, Stanford University.
Additional thanks to Professor Jonathan S. Abel for valuable
conversation regarding the tone arm control. Finally,
thanks to the anonymous reviewers for valuable feedback
regarding the application of active listening and track
switching, among other observations.
9. REFERENCES
[1] Ms. Pinky, January 2011. http://www.mspinky.com/.
[2] Native Instruments, January 2011.
http://www.native-instruments.com/.
[3] Rane, January 2011. http://www.rane.com.
[4] Stanton, January 2011. http://www.stantondj.com/.
[5] T. Beamish. D’Groove - a novel digital haptic
turntable for music control. Master’s thesis, UBC,
2004.
[6] R. Bencina. oscpack, Nov. 2006.
http://www.audiomulch.com/~rossb/code/oscpack/.
[7] N. J. Bryan, J. Herrera, J. Oh, and G. Wang. MoMu:
A mobile music toolkit. In Proceedings of the
International Conference on New Interfaces for
Musical Expression (NIME), Sydney, Australia, 2010.
[8] A. Camurri, C. Canepa, and G. Volpe. Active
listening to a virtual orchestra through an expressive
gestural interface: the orchestra explorer. In
Proceedings of the 7th international conference on
New interfaces for musical expression, NIME ’07,
pages 56–61, New York, NY, USA, 2007. ACM.
[9] A. Camurri, G. Volpe, H. Vinet, R. Bresin,
M. Fabiani, G. Dubus, E. Maestre, J. Llop,
J. Kleimola, S. Oksanen, V. Välimäki, and
J. Seppänen. User-centric context-aware mobile
applications for embodied music listening. In
O. Akan, P. Bellavista, J. Cao, F. Dressler, D. Ferrari,
M. Gerla, H. Kobayashi, S. Palazzo, S. Sahni, X. S.
Shen, M. Stan, J. Xiaohua, A. Zomaya, G. Coulson,
P. Daras, and O. M. Ibarra, editors, User Centric
Media, volume 40 of Lecture Notes of the Institute for
Computer Sciences, Social Informatics and
Telecommunications Engineering, pages 21–30.
Springer Berlin Heidelberg, 2010.
[10] P. R. Cook and G. P. Scavone. The Synthesis ToolKit
(STK). In Proceedings of the International Computer
Music Conference, Beijing, China, 1999.
[11] N. Gillian, S. O’Modhrain, and G. Essl. Scratch-off:
A gesture based mobile music game with tactile
feedback. In Proceedings of the 2009 conference on
New Interfaces for Musical Expression, NIME ’09,
pages 308–311, 2009.
[12] M. Hans, A. Slayden, M. Smith, B. Banerjee, and
A. Gupta. Djammer: a new digital, mobile, virtual,
personal musical instrument. In Proceedings of the IEEE
International Conference on Multimedia and Expo, 2005.
[13] M. C. Hans and M. T. Smith. Interacting with audio
streams for entertainment and communication. In
Proceedings of the eleventh ACM international
conference on Multimedia, MULTIMEDIA ’03, pages
539–545, New York, NY, USA, 2003. ACM.
[14] M. C. Hans and M. T. Smith. A wearable networked
MP3 player and “turntable” for collaborative
scratching. In Proceedings of the IEEE International
Symposium on Wearable Computers, page 138, 2003.
[15] K. F. Hansen. The acoustics and performance of DJ
scratching. PhD thesis, KTH Royal Institute of
Technology, 2010.
[16] K. F. Hansen, M. Alonso, and S. Dimitrov.
Combining DJ scratching, tangible interfaces and a
physics-based model of friction sounds. In Proceedings
of the International Computer Music Conference,
pages 45–48, 2007.
[17] K. F. Hansen and R. Bresin. Analysis of a genuine
scratch performance. In Proceedings of the Gesture
Workshop, pages 519–528, 2003.
[18] K. F. Hansen and R. Bresin. The skipproof virtual
turntable for high-level control of scratching. Comput.
Music J., 34:39–50, June 2010.
[19] H. Ishii and B. Ullmer. Tangible bits: towards
seamless interfaces between people, bits and atoms. In
Proceedings of the SIGCHI conference on Human
factors in computing systems, CHI ’97, pages 234–241,
New York, NY, USA, 1997. ACM.
[20] S. Jordà, M. Kaltenbrunner, G. Geiger, and
R. Bencina. The reacTable. In Proceedings of the
International Computer Music Conference (ICMC),
pages 579–582, 2005.