543 Bibliography Updates
updated 1 Mar, 2014
Ultimate destination: Mendeley database
GENERAL
[Karam 2013] L. Karam, B. Kleijn, and K. MacLean, eds., Perception-based Media
Processing, Special Issue, Proceedings of the IEEE, vol. 101:9, 2013.
http://dx.doi.org/10.1109/jproc.2013.2276451
MANUAL INTERACTION DESIGN
[Rogers 2005] W. A. Rogers, A. D. Fisk, A. C. McLaughlin, and R. Pak, “Touch a
Screen or Turn a Knob: Choosing the Best Device for the Job,” Human Factors,
vol. 47:2, pp. 271-288, 2005.
http://global.factiva.com/
[Poupyrev 2002] I. Poupyrev, S. Maruyama, and J. Rekimoto, “Ambient touch:
designing tactile interfaces for handheld devices,” in Proc. of the 15th Ann ACM
Symp on User Interface Software and Technology (UIST'02), pp. 51–60, 2002.
http://portal.acm.org/citation.cfm?id=571985.571993
[Poupyrev 2003] I. Poupyrev and S. Maruyama, “Tactile interfaces for small touch
screens,” in Proc. of 16th annual ACM Symposium on User Interface Software
and Technology (UIST'03), pp. 217–220, 2003.
http://portal.acm.org/citation.cfm?id=964696.964721
HAPTIC SKETCHING AND RAPID PROTOTYPING
[Tietz 2008] W. Tietz, “Haptic design of vehicle interiors at AUDI,” in Human Haptic
Perception: Basics and Applications, M. Grunwald, Ed.: Birkhäuser Basel, 2008,
pp. 439-444.
http://dx.doi.org/10.1007/978-3-7643-7612-3_35
HAPTIC RENDERING
[Salisbury 1997] J. K. Salisbury and M. A. Srinivasan, “Phantom-based haptic
interaction with virtual objects,” Computer Graphics and Applications, IEEE, vol.
17:5, pp. 6-10, 1997.
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1626171&tag=1
[Salisbury 1997] K. Salisbury and C. Tarr, “Haptic Rendering of Surfaces Defined by
Implicit Functions,” in Proc. of 6th Annual Symposium on haptic interfaces for
virtual environment and teleoperator systems, Dallas, TX, pp. 61-68, 1997.
https://dac.escet.urjc.es/docencia/RVA/06-07/salisbury97_HAPTIC%20RENDERING%20OF%20SURFACES%20DEFINED%20BY%20IMPLICIT%20FUNCTIONS_implicit1.pdf
[Zilles 1995] C. B. Zilles and J. K. Salisbury, “A Constraint-based God-Object Method
for Haptic Display,” in Proc. of IEEE Conference on Intelligent Robots and
Systems (IROS '95), 1995.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=525876
HAPTIC PSYCHOPHYSICS
[Hayward 2008] V. Hayward, “A brief taxonomy of tactile illusions and demonstrations
that can be done in a hardware store,” Brain Research Bulletin (Special Issue on
Robotics and Neuroscience), vol. 75:6, pp. 742-752, 2008.
[Klatzky 2003] R. L. Klatzky and S. J. Lederman, “Touch,” in Experimental
Psychology, vol. 4, Handbook of Psychology, A. F. Healy and R. W. Proctor, Eds.
New York: John Wiley & Sons, 2003, pp. 147-176.
[Robles-De-La-Torre 2001] G. Robles-De-La-Torre and V. Hayward, “Force can
overcome object geometry in the perception of shape through active touch,”
Nature, vol. 412, pp. 445–448, 2001.
http://www.nature.com/nature/journal/v412/n6845/abs/412445a0.html
[Gescheider 1997] G. A. Gescheider, Psychophysics: The Fundamentals, 3rd ed.
Mahwah, NJ, USA: Lawrence Erlbaum Associates, 1997.
http://psycnet.apa.org/psycinfo/1997-08651-000
[Tan 1992] H. Z. Tan, X. D. Pang, and N. I. Durlach, “Manual Resolution of Length,
Force, and Compliance,” in Proc. of the 1st Ann. Symp. on Haptic Interfaces for
Virtual Environment and Teleoperator Systems, ASME/IMECE, 42, pp. 13-18,
1992.
https://engineering.purdue.edu/~hongtan/pubs/PDFfiles/C05_Tan_ASME1992.pdf
CROSSMODAL PERCEPTION
[Alais 2004] D. Alais, C. Morrone, and D. Burr, “Separate attentional resources for
vision and audition,” J. Neuroscience, vol. 273:1592, pp. 1-12, 2004.
http://www.journals.royalsoc.ac.uk/content/q13x4459041347v6/
[Adelstein 2003] B. D. Adelstein, D. R. Begault, M. R. Anderson, and E. M. Wenzel,
“Sensitivity to haptic-audio asynchrony,” in Proceedings of the 5th international
conference on Multimodal interfaces. Vancouver, British Columbia, Canada:
ACM, 2003, pp. 73-76.
http://dl.acm.org/citation.cfm?id=958448
[Bach-y-Rita 2003] P. Bach-y-Rita, M. E. Tyler, and K. A. Kaczmarek, “Seeing with
the Brain,” International Journal of Human-Computer Interaction, vol. 15:2, pp.
285-295, 2003.
http://dx.doi.org/10.1207/S15327590IJHC1502_6
[Wu 1999] W.-C. Wu, C. Basdogan, and M. A. Srinivasan, “Visual, Haptic and
Bimodal Perception of Size and Stiffness in Virtual Environments,” in Proc. of the
8th Ann. Symp. on Haptic Interfaces for Virtual Environment and Teleoperator
Systems, ASME/IMECE, Nashville, TN, DSC:67, pp. 19-26, 1999.
http://touchlab.mit.edu/publications/1999_011.pdf
[Ernst 2002] M. O. Ernst and M. S. Banks, “Humans integrate visual and haptic
information in a statistically optimal fashion,” Nature, vol. 415:Letters, pp.
429–433, 2002.
http://www.nature.com/nature/journal/v415/n6870/pdf/415429a.pdf
[Jaimes 2007] A. Jaimes and N. Sebe, “Multimodal human–computer interaction: A
survey,” Computer Vision and Image Understanding, vol. 108:1–2, pp. 116-134,
2007.
http://dx.doi.org/10.1016/j.cviu.2006.10.019
[Frassinetti 2002] F. Frassinetti, N. Bolognini, and E. Làdavas, “Enhancement of
visual perception by crossmodal visuo-auditory interaction,” Experimental Brain
Research, vol. 147:3, pp. 332-343, 2002.
http://dx.doi.org/10.1007/s00221-002-1262-y
[Macaluso 2001] E. Macaluso and J. Driver, “Spatial attention and crossmodal
interactions between vision and touch,” Neuropsychologia, vol. 39:12, pp.
1304-1316, 2001.
http://www.sciencedirect.com/science/article/B6T0D-43YH3SD-6/2/fa4b8a23f7d330898694cbf2a131ccbb
[Molholm 2002] S. Molholm, W. Ritter, M. M. Murray, D. C. Javitt, C. E. Schroeder,
and J. J. Foxe, “Multisensory auditory–visual interactions during early sensory
processing in humans: a high-density electrical mapping study,” Cognitive Brain
Research, vol. 14:1, pp. 115-128, 2002.
http://www.sciencedirect.com/science/article/pii/S0926641002000666
[Oviatt 1999] S. Oviatt, “Ten myths of multimodal interaction,” Communications of the
ACM, vol. 42:11, pp. 74-81, 1999.
http://dl.acm.org/citation.cfm?id=319398
[Welch 1980] R. B. Welch and D. H. Warren, “Immediate perceptual response to
intersensory discrepancy,” Psychological Bulletin, vol. 88:3, pp. 638-667, 1980.
http://psycnet.apa.org/journals/bul/88/3/638/
[Von Bekesy 1959] G. Von Bekesy, “Similarities between hearing and skin
sensations,” Psychological Review, vol. 66:1, pp. 1-22, 1959.
http://psycnet.apa.org/journals/rev/66/1/1/
MEASUREMENT / METRICS / DESIGN
Ernest D. Fasse and Neville Hogan. Quantitative Measurement of
Haptic Perception. Proceedings of the IEEE International Conference
on Robotics and Automation, pages 3199–3204, 1994.
Vincent Hayward and Oliver R. Astley. Performance Measures for
Haptic Interfaces. Robotics Research: The 7th Int. Symposium,
1:195–207, 1996.
A.E. Kirkpatrick and S.A. Douglas. Application-based evaluation
of haptic interfaces. Orlando, FL, 24-25 March 2002.
D.A. Lawrence, L.Y. Pao, A.M. Dougherty, M.A. Salada, and
Y. Pavlou. Rate-hardness: a new performance metric for haptic
interfaces. Robotics and Automation, IEEE Transactions on,
16(4):357–371, Aug. 2000.
Karun B. Shimoga. A Survey of Perceptual Feedback Issues in Dextrous
Telemanipulation: Part II. Finger Touch Feedback. Proceedings
of the IEEE Virtual Reality Annual International Symposium
(VRAIS), Seattle, Washington, pages 271–279, 1993.
SPECIFIC DEVICES
I. R. Summers and C. M. Chanter. A broadband tactile array on the fingertip. Journal
of the Acoustical Society of America, 112:2118–2126, 2002.
R. Cholewiak and C. Sherrick. A computer-controlled matrix system
for presentation to the skin of complex spatiotemporal patterns. Behavior
Research Methods and Instrumentation, 13(5):667–673, 1981.
[JL sez this is the method we used to characterize THMB???]
TOUCH SENSING TECHNOLOGY
[Boland 2010] J. J. Boland, “Flexible electronics: Within touch of artificial skin,” Nature
Materials, vol. 9:10, pp. 790-792, 2010.
http://dx.doi.org/10.1038/nmat2861
WEARABLE AND TANGIBLE COMPUTING
[Ni 2009] T. Ni and P. Baudisch, “Disappearing mobile devices,” in Proceedings of the
22nd annual ACM symposium on User interface software and technology.
Victoria, BC, Canada: ACM, 2009, pp. 101-110.
http://dl.acm.org/citation.cfm?id=1622197
[Linz 2006] T. Linz, C. Kallmayer, R. Aschenbrenner, and H. Reichl, “Fully Integrated
EKG Shirt based on Embroidered Electrical Interconnections with Conductive
Yarn and Miniaturized Flexible Electronics,” in Proc. of Int'l Workshop on
Wearable and Implantable Body Sensor Networks (BSN’06), 2006.
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1612887&isnumber=33861
[Klemmer 2009] S. R. Klemmer and J. A. Landay, “Toolkit Support for Integrating
Physical and Digital Interactions,” Human-Computer Interaction, vol. 24:3, pp.
315-366, 2009.
http://www.tandfonline.com/doi/abs/10.1080/07370020902990428
AFFECTIVE INTERFACES
[Bates 1994] J. Bates, “The role of emotion in believable agents,” Communications of
the ACM, vol. 37:7, pp. 122-125, July 1994.
Lindgaard, G., & Whitfield, T. W. A. (2004). Integrating aesthetics within an
evolutionary and psychological framework. Theoretical Issues in Ergonomics Science,
5(1), 73-90.
INTERPRETATION OF GESTURE AND MOVEMENT: AFFECT, HRI
[Gallace 2010] A. Gallace and C. Spence, “The science of interpersonal touch: An
overview,” Neuroscience & Biobehavioral Reviews, vol. 34:2, pp. 246-259, 2010.
http://www.sciencedirect.com/science/article/pii/S0149763408001723
[Duda 2000] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, 2nd ed.:
Wiley-Interscience, 2000.
http://www.amazon.ca/Pattern-Classification-Richard-Duda/dp/0471056693/ref=sr_1_2?
s=books&ie=UTF8&qid=1329418757&sr=1-2
[Efron 1972] D. Efron, Gesture, Race and Culture: A Tentative Study of the Spatio-temporal
and “Linguistic” Aspects of the Gestural Behavior of Eastern Jews and Southern Italians in
New York City, Living under Similar as well as Different Environmental Conditions.
Sketches by Stuyvesant van Veen. The Hague: Mouton, 1972.
[Kendon 2004] A. Kendon, Gesture: Visible Action as Utterance. Cambridge: Cambridge
University Press, 2004.
[Bremner 2009] P. Bremner, A. Pipe, C. Melhuish, M. Fraser, and S. Subramanian,
“Conversational gestures in human-robot interaction,” in Proc. of the 2009 IEEE
International Conference on Systems, Man and Cybernetics (SMC 2009), pp. 1645-1649,
11-14 Oct. 2009.
[Ennis 2010] Cathy Ennis, Rachel McDonnell, and Carol O'Sullivan, “Seeing is believing:
body motion dominates in multisensory conversations,” ACM Transactions on Graphics
(TOG), vol. 29:4, July 2010.
[Lasseter 1987] "Principles of Traditional Animation Applied to 3D Computer Animation",
Computer Graphics, pp. 35-44, 21:4, July 1987 (SIGGRAPH 87).
[Pirie 1995] Meaning through Motion: Kinesthetic English. The English Journal, Vol. 84,
No. 8, Multiple Intelligences (Dec., 1995), pp. 46-51
MEASURING AFFECT; PHYSIOLOGICAL COMPUTING
[Mauss 2009] I. B. Mauss and M. D. Robinson, “Measures of emotion: A review,”
Cognition & Emotion, vol. 23:2, pp. 209-237, 2009.
http://dx.doi.org/10.1080/02699930802204677
[Solovey 2009] E. T. Solovey, A. Girouard, K. Chauncey, L. M. Hirshfield, A.
Sassaroli, F. Zheng, S. Fantini, and R. J. K. Jacob, “Using fNIRS brain sensing in
realistic HCI settings: experiments and guidelines,” in Proceedings of the 22nd
annual ACM symposium on User interface software and technology. Victoria, BC,
Canada: ACM, 2009, pp. 157-166.
http://dl.acm.org/citation.cfm?id=1622207
[Fairclough 2009] S. H. Fairclough, “Fundamentals of physiological computing,”
Interacting with Computers, vol. 21:1–2, pp. 133-145, 2009.
http://www.sciencedirect.com/science/article/pii/S0953543808000738
[DePaulo 1979] B. M. DePaulo and R. Rosenthal, “The structure of nonverbal
decoding skills,” Journal of Personality, vol. 47:3, pp. 506-517, 1979.
http://dx.doi.org/10.1111/j.1467-6494.1979.tb00629.x
[Wensveen 2000] S. Wensveen, K. Overbeeke, and T. Djajadiningrat, “Touch me, hit
me and I know how you feel: a design approach to emotionally rich interaction,”
in Proc. of ACM Conference on Designing Interactive Systems, Brooklyn, NY, pp.
48-52, 2000.
http://portal.acm.org/citation.cfm?id=347642.347661
Wong, D., and Baker, C., “Pain in Children: Comparison of Assessment Scales,”
Pediatric Nursing, 14(1), pp. 9-17, 1988.
Note: Theoretical Issues in Ergonomics Science, 5(1) was a special issue on
affective computing.
Desmet, P. M. A. (2003). Measuring emotion: development and application of an
instrument to measure emotional responses to products. In M. A. Blythe, K. Overbeeke,
A. F. Monk & P. C. Wright (Eds.), Funology: from usability to enjoyment (Vol. 3, pp.
111-123). Dordrecht: Kluwer.
MEDIA
[Gaw 2006] D. Gaw, D. Morris, and K. Salisbury, “Haptically Annotated Movies:
Reaching Out and Touching the Silver Screen,” in Proc. of IEEE Symposium on
Haptic Interfaces for Virtual Environments and Teleoperator Systems (HAPTICS
2006), Washington DC, USA, pp. 287-288, 2006.
[O'Modhrain 2004] S. O'Modhrain and I. Oakley, “Adding interactivity: active touch in
broadcast media,” in Proc. of the 12th International Symposium on Haptic Interfaces
for Virtual Environment and Teleoperator Systems (HAPTICS '04), pp. 293-294, 2004.
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1287211
[Collins 1970] C. C. Collins, “Tactile Television - Mechanical and Electrical Image
Projection,” Man-Machine Systems, IEEE Transactions on, vol. 11:1, pp. 65-71,
1970.
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4081932
COLLABORATION / GUIDANCE
Y. Ishibashi and H. Kaneoka, “Fairness among game players in networked haptic
environments: Influence of network latency,” Proc. IEEE ICME'05, July 2005.
K. Hikichi, H. Morino, I. Arimoto, I. Fukuda, S. Matsumoto, M. Iijima, K. Sezaki,
and Y. Yasuda, “Architecture of haptics communication system for adaptation to
network environments,” Proc. IEEE ICME'01, Aug. 2001.
H. Kaneoka and Y. Ishibashi, “Effects of group synchronization control over haptic
media in collaborative work,” Proc. of the 14th International Conference on Artificial
Reality and Telexistence (ICAT'04), pp. 138-145, Nov./Dec. 2004.
D. Feygin, M. Keehner, and F. Tendick, “Haptic Guidance: Experimental Evaluation
of a Haptic Training Method for a Perceptual Motor Skill,” in Proc. Symp. Haptic
Interfaces for Virtual Environment and Teleoperator Systems, p. 40, 2002.
S. Saga, N. Kawakami, and S. Tachi, “Haptic Teaching using Opposite Force
Presentation,” in Proc. WHC, pp. 18-20, Mar. 2005.
C. Teo, E. Burdet, and H. Lim, “A Robotic Teacher of Chinese Handwriting,” in
Proc. Int. Symp. Haptic Interfaces for Virtual Environment and Teleoperator
Systems, pp. 335-341, Mar. 2002.
G. Srimathveeravalli and K. Thenkurussi, “Motor Skill Training Assistance Using
Haptic Attributes,” in Proc. WHC, pp. 452-457, Mar. 2005.
OTHER APPLICATIONS
H. Delingette and N. Ayache. Hepatic surgery simulation. Communications of the
ACM, 48(2):31–36, 2005.
HUMAN-ROBOT INTERACTION
[Arkin 2003] Ronald C. Arkin, Masahiro Fujita, Tsuyoshi Takagi, Rika Hasegawa, An
ethological and emotional basis for human-robot interaction, Robotics and Autonomous
Systems, Volume 42, Issues 3-4, 31 March 2003, Pages 191-201.
[Olsen 2003] Olsen, D., and Goodrich, M., Metrics for evaluating human-robot
interactions. In Proc. NIST Performance Metrics for Intelligent Systems Workshop 2003.
[Shibata 1997] T. Shibata, M. Yoshida, and J. Yamato, “Artificial emotional creature for
human-machine interaction,” in Proc. of the 1997 IEEE International Conference on
Systems, Man, and Cybernetics (‘Computational Cybernetics and Simulation’), vol. 3,
pp. 2269-2274, 12-15 Oct. 1997.
[Shibata 1999] T. Shibata, T. Tashima, and K. Tanie, “Emergence of emotional behavior
through physical interaction between human and robot,” in Proc. of the 1999 IEEE
International Conference on Robotics and Automation, vol. 4, pp. 2868-2873, 1999.
[Yanco 2004] H. A. Yanco and J. Drury, “Classifying human-robot interaction: an updated
taxonomy,” in Proc. of the 2004 IEEE International Conference on Systems, Man and
Cybernetics, vol. 3, pp. 2841-2846, 10-13 Oct. 2004.
[Breemen 2004] A. J. N. van Breemen, “Bringing robots to life: Applying principles of
animation to robots,” in Proceedings of the Workshop on Shaping Human-Robot
Interaction -- Understanding the Social Aspects of Intelligent Robotic Products, in
cooperation with the CHI2004 Conference, Vienna, Apr. 2004.
[Young 2008] James E. Young, Takeo Igarashi, and Ehud Sharlin, “Puppet Master:
designing reactive character behavior by demonstration,” Proceedings of the 2008 ACM
SIGGRAPH/Eurographics Symposium on Computer Animation, July 07-09, 2008,
Dublin, Ireland.
HUMAN-ANIMAL INTERACTION
[Brooks 1991] R. A. Brooks, “Challenges for Complete Creature Architectures,” in
Jean-Arcady Meyer and Stewart W. Wilson (eds.), From Animals to Animats: Proceedings
of the First International Conference on Simulation of Adaptive Behavior (Complex
Adaptive Systems). MIT Press, 1991.
ATTENTION AND INTERRUPTION
Adamczyk, Bailey. 2004. If not now, when?: the effects of interruption at different
moments within task execution. CHI 2004.
http://doi.acm.org/10.1145/985692.985727
[Lee 2004] J. D. Lee, J. D. Hoffman, and E. Hayes, “Collision warning design to
mitigate driver distraction,” in Proc. of the ACM Conf on Human Factors in
Computing Systems (CHI '04), Vienna, pp. 65-72, 2004.
[Oulasvirta 2005] A. Oulasvirta, S. Tamminen, V. Roto, and J. Kuorelahti, “Interaction
in 4-second bursts: The fragmented nature of attentional resources in mobile
HCI,” in Proc. of ACM Conference on Human Factors in Computing Systems
(CHI '05), CHI Letters, vol. 7, no. 1, pp. 919-928, 2005.
http://portal.acm.org/citation.cfm?id=1054972.1055101
Patten, C., Kircher, A., Ostlund, J., & Nilsson, L., Using mobile telephones: cognitive workload
and attention resource allocation. Accident Analysis & Prevention, 2004. 36(3): p. 341-350.
GENERAL HCI
[Bates 1992] Kantrowitz, M. and Bates, J. (1992). Natural Language Text Generation in
the Oz Interactive Fiction Project. In Dale, R., Hovy, E., Rosner, D., and Stock, O.,
editors, Aspects of Automated Natural Language Generation, Volume 587 of Lecture
Notes in Artificial Intelligence, pp. 13-28. Springer-Verlag. (This is the Proceedings of
the Sixth International Workshop on Natural Language Generation, Trento, Italy, April
1992.) Also appeared as Technical Report CMU-CS-92-107, School of Computer
Science, Carnegie Mellon University, Pittsburgh, PA, April 1992.
The Encyclopedia of Human-Computer Interaction
http://www.interaction-design.org/encyclopedia/
Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests,
Second Edition
http://library.books24x7.com.ezproxy.library.ubc.ca/toc.asp?site=DHD8T&bookid=25203
Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics
http://library.books24x7.com.ezproxy.library.ubc.ca/toc.asp?site=DHD8T&bookid=32314
Observing the User Experience: A Practitioner's Guide to User Research
http://library.books24x7.com.ezproxy.library.ubc.ca/toc.asp?site=DHD8T&bookid=7061
Paper Prototyping: The Fast and Easy Way to Define and Refine User Interfaces
http://library.books24x7.com.ezproxy.library.ubc.ca/toc.asp?site=DHD8T&bookid=7062
RELATION FROM PROPRIOCEPTION TO AESTHETICS
• S. Guest, J. M. Dessirier, A. Mehrabyan, F. McGlone, G. Essick, G. Gescheider, A.
Fontana, R. Xiong, R. Ackerley, and K. Blot, “The development and validation of
sensory and emotional scales of touch perception,” Attention, Perception, &
Psychophysics, vol. 73, no. 2, pp. 531–550, 2011.
• F. McGlone, H. Olausson, J. A. Boyle, M. Jones-Gotman, C. Dancer, S. Guest,
and G. Essick, “Touching and feeling: differences in pleasant touch processing between
glabrous and hairy skin in humans,” European Journal of Neuroscience, 2012.
• G. K. Essick, F. McGlone, C. Dancer, D. Fabricant, Y. Ragin, N. Phillips, T. Jones,
and S. Guest, “Quantitative assessment of pleasant touch,” Neuroscience &
Biobehavioral Reviews, vol. 34, no. 2, pp. 192–203, 2010.
• M. Jakesch and C. C. Carbon, “The Mere Exposure Effect in the Domain of
Haptics,” PloS one, vol. 7, no. 2, p. e31215, 2012.
• R. Klatzky and J. Peck, “Please Touch: Object Properties that Invite Touch,”
Haptics, IEEE Transactions on, no. 99, pp. 1–1, 2011.
• J. Peck and T. L. Childers, “Individual differences in haptic information processing:
The ‘need for touch’ scale,” Journal of Consumer Research, vol. 30, no. 3, pp. 430–442,
2003.
• E. Koskinen, T. Kaaresoja, and P. Laitinen, “Feel-good touch: finding the most
pleasant tactile feedback for a mobile touch screen button,” in Proceedings of the 10th
international conference on Multimodal interfaces, 2008, pp. 297–304.
• Y. J. Zheng and J. B. Morrell, “Haptic Actuator Design Parameters That Influence
Affect and Attention.”
THERMAL
[Jones 2009] L. Jones, “Thermal Touch,” Scholarpedia, vol. 4:5, pp. 7955, 2009.
http://www.scholarpedia.org/article/Thermal_touch
On density:
Warm and cold spots are only a few millimeters in diameter, and are distributed
independently. There are more cold spots than warm spots, and the density of spots
varies across the body. For example, on the forearm it is estimated that there are
approximately 7 cold spots and 0.24 warm spots per 100 mm2. In addition to
differences in the distribution of cold and warm thermoreceptors across the skin surface,
the two types of receptor differ with respect to the conduction velocities of the afferent
fibers that convey information from the receptor to the central nervous system. Cold
afferent fibers are myelinated and therefore conduct much faster than unmyelinated warm
afferent fibers, with conduction velocities of 10-20 m/s compared to 1-2 m/s for warm fibers.
As would be expected from these differences in conduction velocities, the time to
respond to a cold stimulus is significantly shorter than that for a warm stimulus.
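As a rough back-of-the-envelope sketch of why this velocity gap implies slower responses to
warm stimuli (the 1 m receptor-to-CNS path length below is an assumed, illustrative value,
not a figure from the article):

# Illustrative only: conduction delay = path length / conduction velocity.
# The path length is an assumption; velocity ranges are those quoted above
# (cold: 10-20 m/s myelinated, warm: 1-2 m/s unmyelinated).
PATH_LENGTH_M = 1.0  # assumed receptor-to-CNS distance, in metres

def conduction_delay_ms(velocity_m_per_s, path_m=PATH_LENGTH_M):
    """Travel time (ms) along the afferent fiber at a given conduction velocity."""
    return 1000.0 * path_m / velocity_m_per_s

for label, v_lo, v_hi in [("cold (myelinated, 10-20 m/s)", 10.0, 20.0),
                          ("warm (unmyelinated, 1-2 m/s)", 1.0, 2.0)]:
    print(f"{label}: {conduction_delay_ms(v_hi):.0f}-{conduction_delay_ms(v_lo):.0f} ms")

Under those assumptions the conduction delay alone is roughly 50-100 ms for cold fibers
versus 500-1000 ms for warm fibers, an order-of-magnitude difference consistent with the
slower response to warm stimuli noted above.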
UNSORTED