
Chapter 9 Conclusion

9.2 Future Work

As noted in the background to this research, vibrotactile feedback, a technology that embodies the information of touching a physical object, is already used in mobile devices and games. It is therefore quite plausible that the new form of vibration feedback proposed here, which embodies conceptual information, will likewise find its way into a wide range of devices. Until now, vibration feedback has been designed without regard to its meaning; this research is expected to open new possibilities for those who design such feedback.

Remaining issues include implementing the application examples given above and investigating their effects on users. Although we presented plausible use cases, their practicality has not yet been evaluated at all. Moreover, because this research was conducted essentially in a laboratory environment, whether the same effects can be obtained in the real situations where such applications would be used still needs to be investigated. A further issue is the creation of vibration patterns using multiple actuators.

In this research, a vibration pattern was expressed as a time series of intensities of a single vibrator. However, other work has created patterns by driving multiple actuators sequentially or simultaneously [132, 146]. Increasing the number of vibrators should therefore make it possible to embody conceptual information that could not be expressed so far.
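This representation, a single vibrator whose intensity varies over time, versus multiple actuators driven sequentially or simultaneously, can be sketched as a small data structure. The following is a hypothetical illustration only (the class and field names are not from the thesis, and a 50 ms step resolution is assumed):

```python
from dataclasses import dataclass, field

@dataclass
class VibrationPattern:
    """A vibration pattern as per-step amplitudes for one or more actuators."""
    step_ms: int = 50                      # assumed duration of one time step
    # channels[i][t] = amplitude (0.0-1.0) of actuator i at time step t
    channels: list = field(default_factory=list)

    def duration_ms(self) -> int:
        """Total pattern length: longest channel times the step duration."""
        return max((len(c) for c in self.channels), default=0) * self.step_ms

# Single-vibrator pattern, as used in this research: one channel whose
# intensity rises and falls over time.
single = VibrationPattern(channels=[[0.2, 0.6, 1.0, 0.6, 0.2]])

# Multi-actuator pattern in the spirit of [132, 146]: two vibrators
# driven one after the other to create a spatiotemporal pattern.
multi = VibrationPattern(channels=[[1.0, 1.0, 0.0, 0.0],
                                   [0.0, 0.0, 1.0, 1.0]])

print(single.duration_ms())  # 250
print(multi.duration_ms())   # 200
```

Adding channels leaves single-vibrator patterns as the one-channel special case, which is why extending the present work to multiple actuators would not invalidate the patterns already collected.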

Acknowledgments

I am deeply grateful to Associate Professor Yuichiro Kinoshita of the Faculty of Engineering, University of Yamanashi, for giving me the opportunity to conduct this research and for his many invaluable instructions and suggestions. He shared with me a wealth of knowledge in the field of HCI when my own was still lacking, and advised me not only on the research itself but also on my career after graduation and on student life. I sincerely thank him for the chance to carry out this research under his guidance.

I sincerely thank Professor Kentaro Go of the Faculty of Engineering, University of Yamanashi, for his careful and valuable advice at many stages of this research. When I was searching for a research direction, his advice enabled me to move forward. I thank him again for the opportunity to conduct this research with his cooperation.

I sincerely thank the examiners of this research, Professor Xiaoyang Mao, Professor Kenji Ozawa, Associate Professor Masaki Omata, and Associate Professor Toshiya Kitamura of the Faculty of Engineering, University of Yamanashi, for their valuable advice on its content in many settings, including research presentations, assistance with the research, and daily lectures.

I thank the members of the Go laboratory for their many comments on my research, through the joint seminars with the Kinoshita laboratory and in preparing for the various presentations. I am deeply grateful for their cooperation at many stages of this research.

I am deeply grateful to the members of the Kinoshita laboratory for their cooperation throughout this research. My seniors, 鈴木圭佑 and 中野友文, gave me much valuable advice in carrying out the research, and also helped me in many aspects of laboratory life. My classmates 鈴木拓海 and 甲賀まゆみ, and my juniors 坂本凌, 岡洋介, 小宮山憂, 増田愛美, 藤原耕平, 塚田悠斗, 戸塚敬, 西尾昌人, 一瀬幸子, 金高遥, and 萩原寛也, gave me fresh stimulation through their progress and their attitudes toward research. I feel that I was able to see this research through to the end thanks to everyone in the Kinoshita laboratory, and I offer them my heartfelt thanks. I also sincerely thank those who, despite their busy schedules, cooperated as participants in the experiments.

I am deeply grateful to my parents and family, who raised me and have supported me in many aspects of my university life, as well as in this research.

Finally, this research was completed thanks to the guidance and cooperation of many people. I once again express my deep gratitude to all of them.

References

[1] Cauchard, J. R., Cheng, J. L., Pietrzak, T. and Landay, J. A.: ActiVibe: Design and Evaluation of Vibrations for Progress Monitoring; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3261–3271 (2016).

[2] Hwang, J. and Hwang, W.: Vibration Perception and Excitatory Direction for Haptic Devices; Journal of Intelligent Manufacturing, Vol. 22, No. 1, pp. 17–27 (2011).

[3] Karuei, I., MacLean, K. E., Foley-Fisher, Z., MacKenzie, R., Koch, S. and El-Zohairy, M.: Detecting Vibrations across the Body in Mobile Contexts; Proceedings of the 2011 CHI Conference on Human Factors in Computing Systems, pp. 3267–3276 (2011).

[4] Kosmalla, F., Wiehr, F., Daiber, F., Krüger, A. and Löchtefeld, M.: ClimbAware: Investigating Perception and Acceptance of Wearables in Rock Climbing; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1097–1108 (2016).

[5] Roumen, T., Perrault, S. T. and Zhao, S.: NotiRing: A Comparative Study of Notification Channels for Wearable Interactive Rings; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 2497–2500 (2015).

[6] Ryu, J., Jung, J. and Choi, S.: Perceived Magnitudes of Vibrations Transmitted through Mobile Device; Proceedings of the International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 139–140 (2008).

[7] Saket, B., Prasojo, C., Huang, Y. and Zhao, S.: Designing an Effective Vibration-based Notification Interface for Mobile Phones; Proceedings of the 2013 Conference on Computer Supported Cooperative Work, pp. 1494–1504 (2013).

[8] Brewster, S. and Brown, L. M.: Tactons: Structured Tactile Messages for Non-visual Information Display; Proceedings of the Fifth Conference on Australasian User Interface, pp. 15–23 (2004).

[9] Qian, H., Kuber, R. and Sears, A.: Towards Developing Perceivable Tactile Feedback for Mobile Devices; International Journal of Human-Computer Studies, Vol. 69, No. 11, pp. 705–719 (2011).

[10] Shneiderman, B.: Designing the User Interface: Strategies for Effective Human-Computer Interaction (3rd ed.); Addison-Wesley Longman Publishing Co., Inc. (1997).

[11] Oakley, I., McGee, M. R., Brewster, S. and Gray, P.: Putting the Feel in ‘Look and Feel’; Proceedings of the 2000 CHI Conference on Human Factors in Computing Systems, pp. 415–422 (2000).

[12] Ando, H., Watanabe, J., Inami, M., Sugimoto, M. and Maeda, T.: A Fingernail-Mounted Tactile Display for Augmented Reality Systems; Electronics and Communications in Japan (Part 2), Vol. 90, No. 4, pp. 56–65 (2007).

[13] Fukumoto, M. and Sugimura, T.: Active Click: Tactile Feedback for Touch Panels; CHI 2001 Extended Abstracts on Human Factors in Computing Systems, pp. 121–122 (2001).

[14] Kim, S. and Lee, G.: Haptic Feedback Design for a Virtual Button along Force-displacement Curves; Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, pp. 91–96 (2013).

[15] 任天堂株式会社: 学研さんと「HD振動」のヒミツについて調べてみました, https://topics.nintendo.co.jp/c/article/73d9531a-6bbe-11e7-8cda-063b7ac45a6d.html (2017–8–10).

[16] Kinoshita, Y., Tsukanaka, S. and Go, K.: Strolling with Street Atmosphere Visualization: Development of a Tourist Support System; Proceedings of the 2013 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 553–558 (2013).

[17] Boy, J., Pandey, A. V., Emerson, J., Satterthwaite, M., Nov, O. and Bertini, E.: Showing People Behind Data: Does Anthropomorphizing Visualizations Elicit More Empathy for Human Rights Data?; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 5462–5474 (2017).

[18] Ishii, H. and Ullmer, B.: Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms; Proceedings of the 1997 CHI Conference on Human Factors in Computing Systems, pp. 234–241 (1997).

[19] Gutwin, C., Cockburn, A. and Coveney, A.: Peripheral Popout: The Influence of Visual Angle and Stimulus Intensity on Popout Effects; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 208–219 (2017).

[20] Healey, C. and Enns, J.: Attention and Visual Memory in Visualization and Computer Graphics; IEEE Transactions on Visualization and Computer Graphics, Vol. 18, No. 7, pp. 1170–1188 (2012).

[21] Ngo, D. C. L., Teo, L. and Byrne, J.: Formalising Guidelines for the Design of Screen Layouts; Displays, Vol. 21, No. 1, pp. 3–15 (2000).

[22] Ngo, D. C. L., Teo, L. and Byrne, J.: Modelling Interface Aesthetics; Information Sciences, Vol. 152, No. 1, pp. 25–46 (2003).

[23] Serrano, M., Roudaut, A. and Irani, P.: Investigating Text Legibility on Non-Rectangular Displays; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 498–508 (2016).

[24] Serrano, M., Roudaut, A. and Irani, P.: Visual Composition of Graphical Elements on Non-Rectangular Displays; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 4405–4416 (2017).

[25] Angeli, A. D., Sutcliffe, A. and Hartmann, J.: Interaction, Usability and Aesthetics: What Influences Users’ Preferences?; Proceedings of the 2006 DIS Conference on Designing Interactive Systems, pp. 271–280 (2006).

[26] Kurosu, M. and Kashimura, K.: Apparent Usability vs. Inherent Usability: Experimental Analysis on the Determinants of the Apparent Usability; Proceedings of the 6th Conference on Designing Interactive Systems, pp. 292–293 (1995).

[27] Salimun, C., Purchase, H. C., Simmons, D. R. and Brewster, S.: The Effect of Aesthetically Pleasing Composition on Visual Search Performance; Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, pp. 422–431 (2010).

[28] Sonderegger, A. and Sauer, J.: The Influence of Design Aesthetics in Usability Testing: Effects on User Performance and Perceived Usability; Applied Ergonomics, Vol. 41, No. 3, pp. 403–410 (2010).

[29] Cheng, L., Ofek, E., Holz, C., Benko, H. and Wilson, A. D.: Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3718–3728 (2017).

[30] Hanamitsu, N. and Israr, A.: Haplug: A Haptic Plug for Dynamic VR Interactions; Haptic Interaction. AsiaHaptics 2016. Lecture Notes in Electrical Engineering, Vol. 432, pp. 479–483 (2016).

[31] Kim, H., Kim, M. and Lee, W.: HapThimble: A Wearable Haptic Device towards Usable Virtual Touch Screen; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3694–3705 (2016).

[32] Peiris, R. L., Peng, W., Chen, Z., Chan, L. and Minamizawa, K.: ThermoVR: Exploring Integrated Thermal Haptic Feedback with Head Mounted Displays; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 5452–5456 (2017).

[33] Schorr, S. B. and Okamura, A. M.: Fingertip Tactile Devices for Virtual Object Manipulation and Exploration; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3115–3119 (2017).

[34] Dasgupta, A., Burrows, S., Han, K. and Rasch, P. J.: Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1193–1204 (2017).

[35] Greis, M., Avci, E., Schmidt, A. and Machulla, T.: Increasing Users’ Confidence in Uncertain Data by Aggregating Data from Multiple Sources; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 828–840 (2017).

[36] Wilson, G., Davidson, G. and Brewster, S. A.: In the Heat of the Moment: Subjective Interpretations of Thermal Feedback During Interaction; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 2063–2072 (2015).

[37] Wilson, G., Halvey, M., Brewster, S. A. and Hughes, S. A.: Some Like It Hot: Thermal Feedback for Mobile Devices; Proceedings of the 2011 CHI Conference on Human Factors in Computing Systems, pp. 2555–2564 (2011).

[38] Tewell, J., Bird, J. and Buchanan, G. R.: The Heat is On: A Temperature Display for Conveying Affective Feedback; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (2017).

[39] Tewell, J., Bird, J. and Buchanan, G. R.: Heat-Nav: Using Temperature Changes as Navigation Cues; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1131–1135 (2017).

[40] Park, Y. and Nam, T.: POKE: a New Way of Sharing Emotional Touches during Phone Conversations; Proceedings of the 2013 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2859–2860 (2013).

[41] Jang, S., Kim, L. H., Tanner, K., Ishii, H. and Follmer, S.: Haptic Edge Display for Mobile Tactile Interaction; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3706–3716 (2016).

[42] Ion, A., Wang, E. J. and Baudisch, P.: Skin Drag Displays: Dragging a Physical Tactor across the User’s Skin Produces a Stronger Tactile Stimulus than Vibrotactile; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 2501–2504 (2015).

[43] Je, S., Rooney, B., Chan, L. and Bianchi, A.: tactoRing: A Skin-Drag Discrete Display; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3106–3114 (2017).

[44] Li, K. A., Baudisch, P., Griswold, W. G. and Hollan, J. D.: Tapping and Rubbing: Exploring New Dimensions of Tactile Feedback with Voice Coil Motors; Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, pp. 181–190 (2008).

[45] Strasnick, E., Cauchard, J. R. and Landay, J. A.: BrushTouch: Exploring an Alternative Tactile Method for Wearable Haptics; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3120–3125 (2017).

[46] Song, S., Noh, G., Yoo, J., Oakley, I., Cho, J. and Bianchi, A.: Hot & Tight: Exploring Thermo and Squeeze Cues Recognition on Wrist Wearables; Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 39–42 (2015).

[47] Stanley, A. A. and Kuchenbecker, K. J.: Evaluation of Tactile Feedback Methods for Wrist Rotation Guidance; IEEE Transactions on Haptics, Vol. 5, No. 3, pp. 240–251 (2012).

[48] He, L., Xu, C., Xu, D. and Brill, R.: PneuHaptic: Delivering Haptic Cues with a Pneumatic Armband; Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 47–48 (2015).

[49] Pohl, H., Brandes, P., Quang, H. N. and Rohs, M.: Squeezeback: Pneumatic Compression for Notifications; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 5318–5330 (2017).

[50] Chinello, F., Aurilio, M., Pacchierotti, C. and Prattichizzo, D.: The HapBand: A Cutaneous Device for Remote Tactile Interaction; Proceedings of EuroHaptics, pp. 284–291 (2014).

[51] Suhonen, K., Väänänen-Vainio-Mattila, K. and Mäkelä, K.: User Experiences and Expectations of Vibrotactile, Thermal and Squeeze Feedback in Interpersonal Communication; Proceedings of the 26th Annual BCS Interaction Specialist Group Conference on People and Computers, pp. 205–214 (2012).

[52] Suzuki, Y., Kobayashi, M. and Ishibashi, S.: Design of Force Feedback Utilizing Air Pressure toward Untethered Human Interface; Proceedings of the 2002 CHI Extended Abstracts on Human Factors in Computing Systems, pp. 808–809 (2002).

[53] Gupta, S., Morris, D., Patel, S. N. and Tan, D.: AirWave: Non-Contact Haptic Feedback using Air Vortex Rings; Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp. 419–428 (2013).

[54] Sodhi, R., Poupyrev, I., Glisson, M. and Israr, A.: AIREAL: Interactive Tactile Experiences in Free Air; ACM Transactions on Graphics, Vol. 32, No. 4, Article No. 134, 10 pages (2013).

[55] Sato, Y. and Ueoka, R.: Investigating Haptic Perception of and Physiological Responses to Air Vortex Rings on a User’s Cheek; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3083–3094 (2017).

[56] Hachisu, T. and Fukumoto, M.: VacuumTouch: Attractive Force Feedback Interface for Haptic Interactive Surface using Air Suction; Proceedings of the 2014 CHI Conference on Human Factors in Computing Systems, pp. 411–420 (2014).

[57] Hoshi, T., Takahashi, M., Iwamoto, T. and Shinoda, H.: Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound; IEEE Transactions on Haptics, Vol. 3, No. 3, pp. 155–165 (2010).

[58] Obrist, M., Seah, S. A. and Subramanian, S.: Talking about Tactile Experiences; Proceedings of the 2013 CHI Conference on Human Factors in Computing Systems, pp. 1659–1668 (2013).

[59] Wilson, G., Carter, T., Subramanian, S. and Brewster, S. A.: Perception of Ultrasonic Haptic Feedback on the Hand: Localisation and Apparent Motion; Proceedings of the 2014 CHI Conference on Human Factors in Computing Systems, pp. 1133–1142 (2014).

[60] Makino, Y., Furuyama, Y., Inoue, S. and Shinoda, H.: HaptoClone (Haptic-Optical Clone) for Mutual Tele-Environment by Real-time 3D Image Transfer with Midair Force Feedback; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1980–1990 (2016).

[61] Yem, V., Okazaki, R. and Kajimoto, H.: FinGAR: Combination of Electrical and Mechanical Stimulation for High-Fidelity Tactile Presentation; ACM SIGGRAPH 2016 Emerging Technologies, Article 7, 2 pages (2016).

[62] Khurelbaatar, S., Nakai, Y., Okazaki, R., Yem, V. and Kajimoto, H.: Tactile Presentation to the Back of a Smartphone with Simultaneous Screen Operation; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3717–3721 (2016).

[63] Bau, O., Poupyrev, I., Israr, A. and Harrison, C.: TeslaTouch: Electrovibration for Touch Surfaces; Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, pp. 283–292 (2010).

[64] Bau, O., Poupyrev, I., Goc, M. L., Galliot, L. and Glisson, M.: REVEL: Tactile Feedback Technology for Augmented Reality; ACM SIGGRAPH 2012 Emerging Technologies, Article 17, 1 page (2012).

[65] Wang, Q., Ren, X. and Sun, X.: Enhancing Pen-Based Interaction using Electrovibration and Vibration Haptic Feedback; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3746–3750 (2017).

[66] Spelmezan, D., Sahoo, D. R. and Subramanian, S.: Sparkle: Towards Haptic Hover-Feedback with Electric Arcs; Proceedings of the 29th Annual Symposium on User Interface Software and Technology, pp. 55–57 (2016).

[67] Elliot, A. J., Kayser, D. N., Greitemeyer, T., Lichtenfeld, S., Gramzow, R. H., Maier, M. A. and Liu, H.: Red, Rank, and Romance in Women Viewing Men; Journal of Experimental Psychology: General, Vol. 139, No. 3, pp. 339–417 (2010).

[68] Elliot, A. J., Maier, M. A., Moller, A. C., Friedman, R. and Meinhardt, J.: Color and Psychological Functioning: the Effect of Red on Performance Attainment; Journal of Experimental Psychology: General, Vol. 136, No. 1, pp. 154–168 (2007).

[69] Frank, M. G. and Gilovich, T.: The Dark Side of Self- and Social Perception: Black Uniforms and Aggression in Professional Sports; Journal of Personality and Social Psychology, Vol. 54, No. 1, pp. 74–85 (1988).

[70] Labrecque, L. and Milne, G.: Exciting Red and Competent Blue: The Importance of Color in Marketing; Journal of the Academy of Marketing Science, Vol. 40, No. 5, pp. 711–727 (2012).

[71] Crowley, A. E.: The Two-Dimensional Impact of Color on Shopping; Marketing Letters, Vol. 4, No. 1, pp. 59–69 (1993).

[72] Bagchi, R. and Cheema, A.: The Effect of Red Background Color on Willingness-to-Pay: The Moderating Role of Selling Mechanism; Journal of Consumer Research, Vol. 39, No. 5, pp. 947–960 (2013).

[73] Bartram, L., Patra, A. and Stone, M.: Affective Color in Visualization; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1364–1374 (2017).

[74] Valdez, P. and Mehrabian, A.: Effects of Color on Emotions; Journal of Experimental Psychology: General, Vol. 123, No. 4, pp. 394–409 (1994).

[75] Gorn, G. J., Chattopadhyay, A., Sengupta, J. and Tripathi, S.: Waiting for the Web: How Screen Color Affects Time Perception; Journal of Marketing Research, Vol. 41, No. 2, pp. 215–225 (2004).

[76] Mathew, D.: vSmileys: Imaging Emotions through Vibration Patterns; Alternative Access: Feelings and Games 2005, pp. 75–80 (2005).

[77] Seifi, H., Zhang, K. and MacLean, K. E.: VibViz: Organizing, Visualizing and Navigating Vibration Libraries; Proceedings of the 2015 IEEE World Haptics Conference, pp. 254–259 (2015).

[78] Seifi, H. and MacLean, K. E.: A First Look at Individuals’ Affective Ratings of Vibrations; Proceedings of the 2013 IEEE World Haptics Conference, pp. 605–610 (2013).

[79] Yoo, Y., Yoo, T., Kong, J. and Choi, S.: Emotional Responses of Tactile Icons: Effects of Amplitude, Frequency, Duration, and Envelope; Proceedings of the 2015 IEEE World Haptics Conference, pp. 235–240 (2015).

[80] Wobbrock, J. O., Morris, M. R. and Wilson, A. D.: User-Defined Gestures for Surface Computing; Proceedings of the 2009 CHI Conference on Human Factors in Computing Systems (2009).

[81] Panëels, S., Anastassova, M. and Brunet, L.: TactiPEd: Easy Prototyping of Tactile Patterns; IFIP Conference on Human-Computer Interaction, pp. 228–245 (2013).

[82] Schneider, O. S. and MacLean, K. E.: Studying Design Process and Example Use with Macaron, a Web-based Vibrotactile Effect Editor; Proceedings of the 2016 IEEE Haptics Symposium, pp. 52–58 (2016).

[83] 飛田 良文, 浅田 秀子: 現代形容詞用法辞典; 東京堂出版 (2008).

[84] Likert, R.: A Technique for the Measurement of Attitudes; Archives of Psychology, Vol. 22, No. 140, 55 pages (1932).

[85] Osgood, C. E., Suci, G. J. and Tannenbaum, P. H.: The Measurement of Meaning; University of Illinois Press, Champaign, IL (1957).

[86] Kaiser, H. F.: Varimax Solution for Primary Mental Abilities; Psychometrika, Vol. 25, No. 2, pp. 153–158 (1960).

[87] Ward, J. H.: Hierarchical Grouping to Optimize an Objective Function; Journal of the American Statistical Association, Vol. 58, No. 301, pp. 236–244 (1963).

[88] Akshita, Sampath, H. A., Indurkhya, B., Lee, E. and Bae, Y.: Towards Multimodal Affective Feedback: Interaction between Visual and Haptic Modalities; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 2043–2052 (2015).

[89] Lang, P. J., Bradley, M. M. and Cuthbert, B. N.: International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual; Technical Report A-8 (2008).

[90] Löchtefeld, M., Lautemann, N., Gehring, S. and Krüger, A.: AmbiPad: Enriching Mobile Digital Media with Ambient Feedback; Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 295–298 (2014).

[91] ユナイテッド・シネマ: 新次元の4Dアトラクションシアター, http://www.unitedcinemas.jp/4dx/ (2017–12–4).

[92] Colley, A., Thebault-Spieker, J., Lin, A. Y., Degraen, D., Fischman, B., Häkkilä, J., Kuehl, K., Nisi, V., Nunes, N. J., Wenig, N., Wenig, D., Hecht, B. and Schöning, J.: The Geography of Pokémon GO: Beneficial and Problematic Effects on Places and Movement; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1179–1192 (2017).

[93] Paavilainen, J., Korhonen, H., Alha, K., Stenros, J., Koskinen, E. and Mäyrä, F.: The Pokémon GO Experience: A Location-Based Augmented Reality Mobile Game Goes Mainstream; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 2493–2498 (2017).

[94] Wilson, G. and Brewster, S. A.: Multi-moji: Combining Thermal, Vibrotactile & Visual Stimuli to Expand the Affective Range of Feedback; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1743–1755 (2017).

[95] Wilson, G., Romeo, P. and Brewster, S. A.: Mapping Abstract Visual Feedback to a Dimensional Model of Emotion; Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1779–1787 (2016).

[96] Wilson, G., Dobrev, D. and Brewster, S. A.: Hot Under the Collar: Mapping Thermal Feedback to Dimensional Models of Emotion; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 4838–4849 (2016).

[97] Halvey, M., Henderson, M., Brewster, S. A., Wilson, G. and Hughes, S. A.: Augmenting Media with Thermal Stimulation; Proceedings of the International Conference on Haptic and Audio Interaction Design, pp. 91–100 (2012).

[98] Akazue, M., Halvey, M., Baillie, L. and Brewster, S.: The Effect of Thermal Stimuli on the Emotional Perception of Images; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 4401–4412 (2016).

[99] Salminen, K., Surakka, V., Raisamo, J., Lylykangas, J., Raisamo, R., Mäkelä, K. and Ahmaniemi, T.: Cold or Hot? How Thermal Stimuli Are Related to Human Emotional System?; Revised Selected Papers of the 8th International Workshop on Haptic and Audio Interaction Design, pp. 20–29 (2013).

[100] Salminen, K., Surakka, V., Raisamo, J., Lylykangas, J., Raisamo, R., Mäkelä, K. and Ahmaniemi, T.: Emotional Responses to Thermal Stimuli; Proceedings of the 13th International Conference on Multimodal Interfaces, pp. 193–196 (2011).

[101] Mothersill, P. and Bove, V. M. Jr.: The EmotiveModeler: an Emotive form Design CAD Tool; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 339–342 (2015).

[102] Lee, J. M., Jeong, S. Y. and Ju, D. Y.: Emotional Interaction and Notification of Flexible Handheld Devices; Proceedings of the 2015 CHI Conference Extended Abstracts on Human Factors in Computing Systems (2015).

[103] Strohmeier, P., Carrascal, J. P., Cheng, B., Meban, M. and Vertegaal, R.: An Evaluation of Shape Changes for Conveying Emotions; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3781–3792 (2016).

[104] Suhonen, K., Müller, S., Rantala, J., Väänänen-Vainio-Mattila, K., Raisamo, R. and Lantz, V.: Haptically Augmented Remote Speech Communication: a Study of User Practices and Experiences; Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design, pp. 361–369 (2012).

[105] Smith, J. and MacLean, K.: Communicating Emotion through a Haptic Link: Design Space and Methodology; International Journal of Human-Computer Studies, pp. 376–387 (2007).

[106] Hassib, M., Pfeiffer, M., Schneegass, S., Rohs, M. and Alt, F.: Emotion Actuator: Embodied Emotional Feedback through Electroencephalography and Electrical Muscle Stimulation; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 6133–6146 (2017).

[107] Russell, J. A.: A Circumplex Model of Affect; Journal of Personality and Social Psychology, Vol. 39, No. 6, pp. 1161–1178 (1980).

[108] 濱 治世, 鈴木 直人, 濱 保久: 感情心理学への招待—感情·情緒へのアプローチ—; 株式会社サイエンス社 (2001).

[109] Taki, R., Maeda, Y. and Takahashi, Y.: Effective Emotional Model of Pet-Type Robot in Interactive Emotion Communication; SCIS & ISIS, pp. 199–204 (2010).

[110] Pedersen, E. W., Subramanian, S. and Hornbæk, K.: Is My Phone Alive?: a Large-Scale Study of Shape Change in Handheld Devices using Videos; Proceedings of the 2014 CHI Conference on Human Factors in Computing Systems, pp. 2579–2588 (2014).

[111] Hemmert, F., Löwe, M., Wohlauf, A. and Joost, G.: Animate Mobiles: Proxemically Reactive Posture Actuation as a Means of Relational Interaction with Mobile Phones; Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, pp. 267–270 (2013).

[112] Yohanan, S. and MacLean, K. E.: Design and Assessment of the Haptic Creature’s Affect Display; Proceedings of the 6th International Conference on Human-Robot Interaction, pp. 473–480 (2011).

[113] Bucci, P., Cang, L. X., Valair, A., Marino, D., Tseng, L., Jung, M., Rantala, J., Schneider, O. S. and MacLean, K. E.: Sketching CuddleBits: Coupled Prototyping of Body and Behaviour for an Affective Robot Pet; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3681–3692 (2017).

[114] Salminen, K., Surakka, V., Lylykangas, J., Rantala, J., Ahmaniemi, T., Raisamo, R., Trendafilov, D. and Kildal, J.: Tactile Modulation of Emotional Speech Samples; Advances in Human-Computer Interaction, Article No. 17, 1 page (2012).

[115] Fitts, P. M.: The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement; Journal of Experimental Psychology, Vol. 47, No. 6, pp. 381–391 (1954).

[116] Bi, X., Li, Y. and Zhai, S.: FFitts Law: Modeling Finger Touch with Fitts’ Law; Proceedings of the 2013 CHI Conference on Human Factors in Computing Systems, pp. 1363–1372 (2013).

[117] Avrahami, D.: The Effect of Edge Targets on Touch Performance; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 1837–1846 (2015).

[118] Holz, C. and Baudisch, P.: The Generalized Perceived Input Point Model and How to Double Touch Accuracy by Extracting Fingerprints; Proceedings of the 2010 CHI Conference on Human Factors in Computing Systems, pp. 581–590 (2010).

[119] Lee, S. and Zhai, S.: The Performance of Touch Screen Soft Buttons; Proceedings of the 2009 CHI Conference on Human Factors in Computing Systems, pp. 309–318 (2009).

[120] ISO 9241-9:2000, Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs) — Part 9: Requirements for Non-Keyboard Input Devices, ISO 2000.

[121] Buschek, D., Luca, A. D. and Alt, F.: Evaluating the Influence of Targets and Hand Postures on Touch-based Behavioural Biometrics; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1349–1361 (2016).

[122] Hart, S. G. and Staveland, L. E.: Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research; Human Mental Workload, North-Holland, pp. 139–183 (1988).

[123] 三宅晋司: メンタルワークロードの計測と解析—NASA-TLX再考—; 人間工学, Vol. 51, No. 6, pp. 391–398 (2015).

[124] Delazio, A., Israr, A. and Klatzky, R. L.: Cross-Modal Correspondence between Vibrations and Colors; Proceedings of the 2017 IEEE World Haptics Conference, pp. 219–224 (2017).

[125] Kaul, O. B. and Rohs, M.: HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 3729–3740 (2017).

[126] Lee, J., Kwon, J. and Kim, H.: Reducing Distraction of Smartwatch Users with Deep Learning; Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 948–953 (2016).

[127] Schloss, K. B., Strauss, E. D. and Palmer, S.: Object Color Preferences; Color Research & Application, Vol. 38, No. 6, pp. 393–411 (2012).

[128] Kauppinen-Räisänen, H. and Luomala, H. T.: Exploring Consumers’ Product-Specific Colour Meanings; Qualitative Market Research: An International Journal, Vol. 13, No. 3, pp. 287–308 (2010).

[129] Chen, X. A., Grossman, T., Wigdor, D. J. and Fitzmaurice, G.: Duet: Exploring Joint Interactions on a Smart Phone and a Smart Watch; Proceedings of the 2014 CHI Conference on Human Factors in Computing Systems, pp. 159–168 (2014).

[130] Lucero, A., Keränen, J. and Korhonen, H.: Collaborative Use of Mobile Phones for Brainstorming; Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services, pp. 337–340 (2010).

[131] McCallum, D. C. and Irani, P.: ARC-Pad: Absolute+Relative Cursor Positioning for Large Displays with a Mobile Touchscreen; Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, pp. 153–156 (2009).

[132] Alvina, J., Zhao, J., Perrault, S. T., Azh, M., Roumen, T. and Fjeld, M.: OmniVib: Towards Cross-Body Spatiotemporal Vibrotactile Notifications for Mobile Phones; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 2487–2496 (2015).

[133] Kaaresoja, T. and Linjama, J.: Perception of Short Tactile Pulses Generated by a Vibration Motor in a Mobile Phone; Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 471–472 (2005).

[134] Wiese, J., Saponas, T. S. and Brush, A. J. B.: Phoneprioception: Enabling Mobile Phones to Infer Where They are Kept; Proceedings of the 2013 CHI Conference on Human Factors in Computing Systems, pp. 2157–2166 (2013).

[135] Exler, A., Dinse, C., Günes, Z., Hammoud, N., Mattes, S. and Beigl, M.: Investigating the Perceptibility of Different Notification Types on Smartphones Depending on the Smartphone Position; Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, pp. 970–976 (2017).

[136] Ko, M., Choi, S., Yatani, K. and Lee, U.: Lock n’ LoL: Group-Based Limiting Assistance App to Mitigate Smartphone Distractions in Group Activities; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 998–1010 (2016).

[137] Kushlev, K., Proulx, J. and Dunn, E. W.: “Silence Your Phones”: Smartphone Notifications Increase Inattention and Hyperactivity Symptoms; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1011–1020 (2016).

[138] Mehrotra, A., Pejovic, V., Vermeulen, J., Hendley, R. and Musolesi, M.: My Phone and Me: Understanding People’s Receptivity to Mobile Notifications; Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1021–1032 (2016).

[139] Blum, J. R., Frissen, I. and Cooperstock, J. R.: Improving Haptic Feedback on Wearable Devices through Accelerometer Measurements; Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology, pp. 31–36 (2015).

[140] Kubo, Y., Takada, R., Shizuki, B. and Takahashi, S.: SynCro: Context-Aware User Interface System for Smartphone-Smartwatch Cross-Device Interaction; Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1794–1801 (2017).

[141] Kubo, Y., Takada, R., Shizuki, B. and Takahashi, S.: Exploring Context-Aware User Interfaces for Smartphone-Smartwatch Cross-Device Interaction; Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 1, No. 3, Article No. 69, 21 pages (2017).

[142] Karani, S. and Varanasi, C.: Uniformity Based Haptic Alert Network; Proceedings of the 29th Annual Symposium on User Interface Software and Technology, pp. 201–202 (2016).

[143] Nagamachi, M.: Kansei Engineering as a Powerful Consumer-Oriented Technology for Product Development; Applied Ergonomics, Vol. 33, No. 3, pp. 289–294 (2002).

[144] Barnes, C. J., Childs, T. H. C., Henson, B. and Southee, C. H.: Surface Finish and Touch — A Case Study in a New Human Factors Tribology; Wear, Vol. 257, No. 7-8, pp. 740–750 (2004).

[145] Hsiao, S. and Huang, H. C.: A Neural Network based Approach for Product form Design; Design Studies, Vol. 23, No. 1, pp. 67–84 (2002).

[146] Lee, J., Han, J. and Lee, G.: Investigating the Information Transfer Efficiency of a 3×3 Watch-Back Tactile Display; Proceedings of the 2015 CHI Conference on Human Factors in Computing Systems, pp. 1229–1232 (2015).

Appendix A  Materials and Experimental Data for the Vibration Pattern Collection Experiment

Figure 1 shows the written instructions presented to the participants in the vibration pattern collection experiment of Chapter 3. Figures 2–31 then show the vibration patterns that each participant generated for each impression word. Finally, Table 1 lists the comments each participant made while generating the vibration patterns.


Vibration Pattern Elicitation Experiment

1. Experiment Overview

Thank you for participating in this experiment. Its purpose is to collect vibration patterns using a vibration pattern generation system. You will be shown a kansei (affective) word and asked to create a vibration that matches its impression.

2. Preparation

The vibration pattern generation system lets you adjust vibration parameters on a PC and check the resulting vibration at any time through the vibration device shown in Figure 1. Before the experiment, we will explain how to operate the system in a video of about one minute. You will also have three minutes to familiarize yourself with the system immediately before the experiment starts, so you do not need to understand every operation at this stage.

※ Hold the vibration device in your non-dominant hand (left hand if you are right-handed).

※ Hold the vibration device with the skin-colored side facing up.

※ The number enclosed in yellow brackets 【 】 at the top right of the screen is the participant number and has nothing to do with the experiment.

Figure 1: Vibration device

3. Procedure

※ You will wear earmuffs during the experiment.

(a) A pair of kansei words is shown at the top of the parameter adjustment screen in Figure 2. A kansei word pair is a combination of two words with opposite meanings. During the experiment, one of the two words is displayed in large yellow letters; create a vibration pattern that matches the impression of that word.

※ In the case of Figure 2, you would create a vibration pattern evoking 「はっきりとした」 (distinct).

(b) While checking the vibration with the "Check Vibration" button, create a vibration pattern that evokes the presented kansei word, as explained in the video.

※ Do not finish a pattern without checking the vibration at least once.

※ You may check the vibration as many times as you like during the experiment.

※ You do not have to use the full length of the horizontal (time) axis (1000 ms).

(c) When you have finished, press the "Enter" key on the system's keyboard; the screen changes to the standby screen shown in Figure 3. Tell the experimenter that you have finished.

(d) While the standby screen is shown, verbally explain to the experimenter what kind of vibration you created, without using the presented kansei word.

※ Your explanation will be recorded with a voice recorder.

(e) After explaining the vibration, answer the question on the rating screen shown in Figure 4 using the sub-PC placed next to you. The question reads: "The vibration pattern I generated is suitable for the presented kansei word." The kansei word is presented as in Figure 2. Enter the number you consider appropriate, from 1 to 7, on the sub-PC's keyboard; the selected number is highlighted in yellow as in Figure 5, and you confirm it with the "Enter" key.

※ You can change the number as many times as you like before confirming.
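The constraints stated in the procedure above, a pattern no longer than the 1000 ms timeline and a confirmed rating on a 1–7 scale, can be sketched as simple validation helpers. This is a hypothetical illustration only, not the actual experiment software; the function names and the 50 ms editing resolution are assumptions.

```python
MAX_DURATION_MS = 1000   # timeline limit stated in the instruction sheet
STEP_MS = 50             # assumed editor resolution (not specified in the source)

def validate_envelope(amplitudes, step_ms=STEP_MS):
    """Check a drawn amplitude envelope against the editor's constraints.

    The pattern need not fill the full 1000 ms, but must not exceed it,
    and every amplitude must lie in [0.0, 1.0].
    """
    if len(amplitudes) * step_ms > MAX_DURATION_MS:
        raise ValueError("pattern exceeds the 1000 ms timeline")
    if any(not 0.0 <= a <= 1.0 for a in amplitudes):
        raise ValueError("amplitude out of range")
    return True

def validate_rating(rating):
    """7-point suitability rating entered on the sub-PC in step (e)."""
    if rating not in range(1, 8):
        raise ValueError("rating must be an integer from 1 to 7")
    return rating

print(validate_envelope([0.0, 0.5, 1.0, 0.5]))  # True
print(validate_rating(6))                        # 6
```

Because participants could revise the rating freely before pressing "Enter", only the confirmed value would pass through a check like `validate_rating` and be recorded.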