International Journal of Recent Trends in Human Computer Interaction (IJHCI)
An international peer-reviewed journal published under the CSC-OpenAccess policy.
ISSN - 2180-1347
Published - Bi-Monthly   |   Established - 2010   |   Year of Publication - 2024

SUBMISSION: March 31, 2024
NOTIFICATION: April 30, 2024
PUBLICATION: May 31, 2024

    
IJHCI CITATION IMPACT
3.346

Refer to the In-Process Citation Report for IJHCI below for complete details.
 
For inquiries and a fast response, contact cscpress@cscjournals.org

IJHCI - CITATION REPORT (IN-PROCESS)

The calculations below are based on citations extracted from Google Scholar.

Total Citations = 174
Self Citations = 0
Total Publications = 52

Citation Impact = (Total Citations - Self Citations) / Total Publications
Citation Impact = (174 - 0) / 52 = 3.346
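As an illustration only (not part of the official report), the figure above can be reproduced with a short script. The function name and the rounding to three decimal places are assumptions made for this sketch; the input totals are the ones stated in this report.

def citation_impact(total_citations, self_citations, total_publications):
    # Citation Impact = (Total Citations - Self Citations) / Total Publications
    return (total_citations - self_citations) / total_publications

# Totals reported above for IJHCI: 174 citations, 0 self citations, 52 publications.
impact = citation_impact(total_citations=174, self_citations=0, total_publications=52)
print(round(impact, 3))  # prints 3.346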

 
SR
M-CODE
CITATION
1
Gengeswari, K., & Sharmeela-Banu, S. A. (2016). Revealing the Underlying Insights on the Use of Social Media by Foreign Students—A Qualitative Approach. Journal Of Business Theory And Practice, 4(1), 139-150.
2
Shrawankar, U., & Dixit, S. S. (2016). Listening deaf through Tactile sign language. In Proceedings of the 10th INDIACom, 3rd International Conference on Computing for Sustainable Global Development, New Delhi (INDIA), IEEE, 16th–18th March.
3
Zhang, C. (2016). Study on Environmental Control System for People with Serious Disabilities.
4
Zhao, M. Y., Ong, S. K., & Nee, A. Y. C. (2016). An Augmented Reality-assisted Therapeutic Healthcare Exercise System Based on Bare-hand Interaction. International Journal of Human-Computer Interaction, (just-accepted).
5
Kirisci, P. T. (2016). Gestaltung mobiler Interaktionsgeräte [Design of mobile interaction devices]. Springer Fachmedien Wiesbaden.
6
Zhang, C. (2016). Study on Environmental Control System for People with Serious Disabilities.
7
Arai, K. (2016). Computer Input Just by Sight and Its Applications in Particular for Disable Persons. In Information Technology: New Generations (pp. 995-1007). Springer International Publishing.
8
Kharate, G. K., & Ghotkar, A. S. (2016). VISION BASED MULTI-FEATURE HAND GESTURE RECOGNITION FOR INDIAN SIGN LANGUAGE MANUAL SIGNS. International Journal on Smart Sensing & Intelligent Systems, 9(1).
9
Arai, K. (2016). Computer Input Just by Sight and Its Applications in Particular for Disable Persons. In Information Technology: New Generations (pp. 995-1007). Springer International Publishing.
10
Hassanat, A. B., Alkasassbeh, M., Al-awadi, M., & Esra'a, A. A. (2016). Color-based object segmentation method using artificial neural network. Simulation Modelling Practice and Theory, 64, 3-17.
11
Negahban, A., Kim, D. J., & Kim, C. (2016). Unleashing the Power of mCRM: Investigating Antecedents of Mobile CRM Values from Managers’ Viewpoint. International Journal of Human-Computer Interaction, (just-accepted).
12
Arai, K. (2016). Computer Input Just by Sight and Its Applications in Particular for Disable Persons. In Information Technology: New Generations (pp. 995-1007). Springer International Publishing.
13
Arai, K. (2016). Computer Input Just by Sight and Its Applications in Particular for Disable Persons. In Information Technology: New Generations (pp. 995-1007). Springer International Publishing.
14
Nazaran, A., Wisco, J. J., Hageman, N., Schettler, S. P., Wong, A., Vinters, H. V., ... & Bangerter, N. K. (2016). Methodology for computing white matter nerve fiber orientation in human histological slices. Journal of neuroscience methods, 261, 75-84.
15
Shamim, A., Balakrishnan, V., Tahir, M., & Ahsan Qureshi, M. (2016). Age and domain specific usability analysis of opinion visualisation techniques. Behaviour & Information Technology, 1-10.
16
于江麗. (2016). Study on Computer Input Device Using Head Movement for People with Disabilities.
17
Alshamari, M. (2016). Assessing and Prioritizing Usability Factors in Health Information System. Trends in Applied Sciences Research, 11(1), 26.
18
張超. (2016). Study on Environmental Control System for People with Serious Disabilities.
19
Darwish, S. M., Madbouly, M. M., & Khorsheed, M. B. (2016). Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach. International Journal of Engineering and Technology, 8(3), 157.
20
Han, B. (2015). The Multimodal Interaction through the Design of Data Glove (Doctoral dissertation, Université d'Ottawa/University of Ottawa).
21
Zhang, C., Ishimatsu, T., Yu, J., Murray, L., & Shi, L. (2015, August). Vision-based displacement sensor for people with serious spinal cord injury. In Mechatronics and Automation (ICMA), 2015 IEEE International Conference on (pp. 772-777). IEEE.
22
Saktel, P., Domke, M., & Vilhekar, L. Performance Improvement of Context Identification for Human Computer Interaction.
23
Wong, C. H., Tan, G. W. H., Tan, B. I., & Ooi, K. B. (2015). Mobile advertising: the changing landscape of the advertising industry. Telematics and Informatics, 32(4), 720-734.
24
Iio, J. (2015). An awareness survey on programming education for beginners: GUI or CUI, which is better? Information Education Symposium 2015 Proceedings, 17-22.
25
Arai, K. (2015). Computer Input by Human Eyes Only and It’s Applications. In Intelligent Systems in Science and Information 2014 (pp. 1-22). Springer International Publishing.
26
Cook, H., Nguyen, Q. V., Simoff, S., Trescak, T., & Preston, D. (2015, September). A Close-Range Gesture Interaction with Kinect. In Big Data Visual Analytics (BDVA), 2015 (pp. 1-8). IEEE.
27
Arai, K. (2015). Computer Input by Human Eyes Only and It’s Applications. In Intelligent Systems in Science and Information 2014 (pp. 1-22). Springer International Publishing.
28
Zhang, C., Shibata, M., Takashima, K., Yu, J., Ishimatsu, T., & Palomino, J. (2015). An Environmental Control System for ALS Patient Using Finger Movement. Modern Mechanical Engineering, 5(04), 122.
29
Abreu, J., Almeida, P., & Silva, T. (2015). A UX Evaluation Approach for Second-Screen Applications. In Applications and Usability of Interactive TV (pp. 105-120). Springer International Publishing.
30
Tzionas, D., Ballan, L., Srikantha, A., Aponte, P., Pollefeys, M., & Gall, J. (2015). Capturing Hands in Action using Discriminative Salient Points and Physics Simulation. arXiv preprint arXiv:1506.02178.
31
Zhang, C., Ishimatsu, T., Yu, J., Murray, L., & Shi, L. (2015, August). Vision-based displacement sensor for people with serious spinal cord injury. In Mechatronics and Automation (ICMA), 2015 IEEE International Conference on (pp. 772-777). IEEE.
32
Nabati, M., & Behrad, A. (2015). 3D Head pose estimation and camera mouse implementation using a monocular video camera. Signal, Image and Video Processing, 9(1), 39-44.
33
Cherng, L. Y., Malim, N. H. A. H., & Singh, M. M. (2015). Trend analysis in ageing and ICT research. Jurnal Teknologi, 76(1).
34
Abreu, J., Almeida, P., & Silva, T. (2015). Enriching Second-Screen Experiences with Automatic Content Recognition. In VI International Conference on Interactive Digital TV IV Iberoamerican Conference on Applications and Usability of Interactive TV (p. 41).
35
Arai, K. (2015). Psychological Status Monitoring with Cerebral Blood Flow, Electroencephalogram and Electro-oculogram Measurements.
36
Tait, M., & Billinghurst, M. N. (2015). U.S. Patent No. 9,146,618. Washington, DC: U.S. Patent and Trademark Office.
37
Arai, K. Psychological Status Monitoring with Cerebral Blood Flow: CBF, Electroencephalogram: EEG and Electro-Oculogram: EOG Measurements.
38
Arai, K. (2015). Relations between Psychological Status and Eye Movements. Signal, 4(6).
39
Guran, A. M., & Cojocar, G. S. O propunere de arhitectură pentru evaluarea metricilor de execuţie folosind AOP [An architecture proposal for evaluating execution metrics using AOP].
40
Zhang, C., Ishimatsu, T., Yu, J., Murray, L., & Shi, L. (2015, August). Vision-based displacement sensor for people with serious spinal cord injury. In Mechatronics and Automation (ICMA), 2015 IEEE International Conference on (pp. 772-777). IEEE.
41
Raffle, H. S., Patel, N., & Braun, M. B. (2015). U.S. Patent No. 9,213,403. Washington, DC: U.S. Patent and Trademark Office.
42
Arai, K. (2015). Computer Input by Human Eyes Only and It’s Applications. In Intelligent Systems in Science and Information 2014 (pp. 1-22). Springer International Publishing.
43
Arai, K. (2015). Psychological Status Monitoring with Cerebral Blood Flow, Electroencephalogram and Electro-oculogram Measurements.
44
Yoshimura, H., Hori, M., Shimizu, T., & Iwai, Y. (2015). Appearance-Based Gaze Estimation for Digital Signage Considering Head Pose. International Journal of Machine Learning and Computing, 5(6), 507.
45
Wise, S. E. N. Worldwide Hospitality and Tourism Themes.
46
Kovalenko, M., Antoshchuk, S., & Sieck, J. (2015, May). Human action recognition using a semantic-probabilistic network. In Emerging Trends in Networks and Computer Communications (ETNCC), 2015 International Conference on (pp. 67-72). IEEE.
47
Ma, F., Wang, H., & Sun, Z. (2015). The Design and Implementation of Natural Human-Robot Interaction System Based on Kinect Sensor.
48
Kovalenko, M., Antoshchuk, S., & Hodovychenko, M. (2015, October). Event recognition using a semantic-probabilistic network. In Information Technologies in Innovation Business Conference (ITIB), 2015 (pp. 35-38). IEEE.
49
Takizawa, H., Yamaguchi, S., Aoyagi, M., Ezaki, N., & Mizuno, S. (2015). Kinect cane: an assistive system for the visually impaired based on the concept of object recognition aid. Personal and Ubiquitous Computing, 1-11.
50
Sonkusare, J. S., Chopade, N. B., Sor, R., & Tade, S. L. (2015, February). A Review on Hand Gesture Recognition System. In Computing Communication Control and Automation (ICCUBEA), 2015 International Conference on (pp. 790-794). IEEE.
51
Polash, M. M., Bhuiyan, N. A., & Kabir, M. H. Secured Dynamic Hand Gestures Detection System.
52
Smruti, S., Sahoo, J., Dash, M., & Mohanty, M. N. (2015, January). An Approach to Design an Intelligent Parametric Synthesizer for Emotional Speech. In Proceedings of the 3rd International Conference on Frontiers of Intelligent Computing: Theory and Applications (FICTA) 2014 (pp. 367-374). Springer International Publishing.
53
Megha, J. V., Padmaja, J. S., & Doye, D. D. Radially Defined Local Binary Patterns for Hand Gesture Recognition.
54
Easton, S., & Wise, N. (2015). Online portrayals of volunteer tourism in Nepal: exploring the communicated disparities between promotional and user-generated content. Worldwide Hospitality and Tourism Themes, 7(2), 141-158.
55
Sheth, K., & Futane, P. R. Indian Sign Language Recognition using Hybrid Video Segmentation Approach.
56
Martin-SanJose, J. F., Juan, M. C., Mollá, R., & Vivó, R. (2015). Advanced displays and natural user interfaces to support learning. Interactive Learning Environments, 1-18.
57
Rungruangbaiyok, S., Duangsoithong, R., & Chetpattananondh, K. (2015, June). Ensemble Threshold Segmentation for hand detection. In Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), 2015 12th International Conference on (pp. 1-5). IEEE.
58
Alqahtani, M. A., Alhadreti, O., AlRoobaea, R. S., & Mayhew, P. J. (2015). Investigation into the Impact of the Usability Factor on the Acceptance of Mobile Transactions: Empirical Study in Saudi Arabia. International Journal of Human Computer Interaction (IJHCI), 6(1), 1.
59
Escalona Neira, I. F. (2014). Interfaz humano máquina controlada por gestos [Gesture-controlled human-machine interface].
60
Lanzola, G., Parimbelli, E., Micieli, G., Cavallini, A., & Quaglini, S. (2014). Data quality and completeness in a web stroke registry as the basis for data and process mining. Journal of healthcare engineering, 5(2), 163-184.
61
Al-Tairi, Z. H., Rahmat, R. W. O., Saripan, M. I., & Sulaiman, P. S. (2014). Skin Segmentation Using YUV and RGB Color Spaces. JIPS, 10(2), 283-299.
62
Song, F., Tan, X., Liu, X., & Chen, S. (2014). Eyes closeness detection from still images with multi-scale histograms of principal oriented gradients. Pattern Recognition, 47(9), 2825-2838.
63
Ghotkar, A. S., & Kharate, G. K. (2014). Study of vision based hand gesture recognition using Indian sign language. Computer, 55, 56.
64
Singhai, S., & Satsangi, C. S. (2014). Hand Segmentation for Hand Gesture Recognition.
65
Bidgar, G., Autade, M., Marathe, P., & Aher, S. Hand Gesture Recognition for HCI (Human-Computer Interaction) using Artificial Neural Network.
66
Shi, Y., Yan, Z., Ge, H., & Mei, L. (2014). Visual Objects Location Based on Hand Eye Coordination. In Future Information Technology (pp. 403-408). Springer Berlin Heidelberg.
67
Miehling, M. J. (2014). Correlation of affiliate performance against web evaluation metrics (Doctoral dissertation, Edinburgh Napier University).
68
Kovalenko, M., Antoshchuk, S., & Sieck, J. (2014, March). Real-Time Hand Tracking and Gesture Recognition Using Semantic-Probabilistic Network. In Computer Modelling and Simulation (UKSim), 2014 UKSim-AMSS 16th International Conference on (pp. 269-274). IEEE.
69
Sharma, K., & Garg, N. K. (2014). International Journal of Computer Application and Technology.
70
Zahran, D. I., Al-Nuaim, H. A., Rutter, M. J., & Benyon, D. (2014). A comparative approach to web evaluation and website evaluation methods. International Journal of Public Information Systems, 10(1).
71
Nyström, N., Kann, L., Pelz-Wall, T., & Bälter, O. (2014). Online user tracking and usability tools-a mapping study.
72
Tzionas, D., Srikantha, A., Aponte, P., & Gall, J. (2014). Capturing hand motion with an RGB-D sensor, fusing a generative model with salient points. In Pattern Recognition (pp. 277-289). Springer International Publishing.
73
Kaur, S., & Banga, V. K. Boltay Hath for Indian Sign Language Recognition. International Journal of Applied Information Systems, 7, 1-7.
74
Potnis, S. S., Jahagirdar, A. P. A. S., & Jahagirdar, A. P. A. S. Nav view search.
75
Mangaiyarkarasi, M., & Geetha, A. Cursor control system using facial expressions for human-computer interactions. International Journal in Computer Science & Electronics (IJTCSE), Annamalai University, Chidambaram, India. ISSN 0976-1353.
76
Khan, S. R., & Bhat, M. S. (2014, March). GUI based industrial monitoring and control system. In Power and Energy Systems Conference: Towards Sustainable Energy, 2014 (pp. 1-4). IEEE.
77
Braier, J., Lattenkamp, K., Räthel, B., Schering, S., Wojatzki, M., & Weyers, B. (2014). Haptic 3D Surface Representation of Table-Based Data for People With Visual Impairments. ACM Transactions on Accessible Computing (TACCESS), 6(1), 1.
78
Muttena, S., Sriram, S., & Shiva, R. (2014, November). Mapping gestures to speech using the kinect. In Science Engineering and Management Research (ICSEMR), 2014 International Conference on (pp. 1-5). IEEE.
79
Konwar, A. S., Borah, B. S., & Tuithung, C. T. (2014, April). An American Sign Language detection system using HSV color model and edge detection. In Communications and Signal Processing (ICCSP), 2014 International Conference on (pp. 743-747). IEEE.
80
Hasija, K., & Mehna, R. Analysis of various methodology of hand gesture recognition system using Matlab. International Journal of Advanced Engineering Research and Science, (5), 28-32.
81
Visittrakoon, P., Nhewbang, W., Jenjob, K., Waranusast, R., & Pattanathaburt, P. nuVision: A Mobility Aid for the Navigation of Visually Impaired People using Depth Information.
82
Richardson, L. J. (2014). Public archaeology in a digital age (Doctoral dissertation, UCL (University College London)).
83
Santos, W. R. D. (2014). Consumidor de mídias móveis: a influência do perfil do adotante na relação entre valor percebido e intenção de compra [Mobile media consumers: the influence of adopter profile on the relationship between perceived value and purchase intention].
84
Nazemi, A., Murray, I., & McMeekin, D. A. (2014). Layout analysis for Scanned PDF and Transformation to the Structured PDF Suitable for Vocalization and Navigation. Computer and Information Science, 7(1), p162.
85
Shamim, A., Balakrishnan, V., & Tahir, M. (2014). Evaluation of opinion visualization techniques. Information Visualization, 1473871614550537.
86
Yang, S., Wang, Y., & Wei, J. (2014). Integration and consistency between web and mobile services. Industrial Management & Data Systems, 114(8), 1246-1269.
87
Evreinova, T. V., Evreinov, G., & Raisamo, R. (2014). An Exploration of Volumetric Data in Auditory Space. Journal of the Audio Engineering Society, 62(3), 172-187.
88
Susi, G., Cristini, A., Salerno, M., & Daddario, E. (2014, November). A low-cost indoor and outdoor terrestrial autonomous navigation model. In Telecommunications Forum Telfor (TELFOR), 2014 22nd (pp. 675-678). IEEE.
89
Arai, K. Psychological Status Monitoring with Cerebral Blood Flow: CBF, Electroencephalogram: EEG and Electro-Oculogram: EOG Measurements.
90
Alroobaea, R., & Mayhew, P. J. (2014, August). How many participants are really enough for usability studies?. In Science and Information Conference (SAI), 2014 (pp. 48-56). IEEE.
91
Köse, G. (2014). Sinirli Alanlarda Konu Tespit ve Takibi Için Genisletilmis Bir Mimari Yapi Önerisi [An extended architecture proposal for topic detection and tracking in restricted domains].
92
Kaur, S., Banga, V., & Amritsar, I. (2014). Boltay Hath for Indian Sign Language Recognition. International Journal of Applied Information Systems, 7(1), 1-7.
93
AlRoobaea, R., & Mayhew, P. J. (2014, August). The impact of usability on e-marketing strategy in international tourism industry. In Science and Information Conference (SAI), 2014 (pp. 961-966). IEEE.
94
Potnis, S. S., & Jahagirdar, A. P. A. S. Real Time Hand Gesture Recognition for Smart Classroom Environment.
95
Sin, A. K., Ahmad, A., Zaman, H. B., & Sulaiman, R. (2014, November). A wearable device for the elderly: A case study in Malaysia. In Information Technology and Multimedia (ICIMU), 2014 International Conference on (pp. 318-323). IEEE.
96
Hartanto, C. (2014). Service Design in Application of Renewing and Redefining User Interface of Smart TV. Taipei University of Technology Interactive Media Design Institute dissertations, 1-96.
97
Evreinova, T. V., Evreinov, G., & Raisamo, R. From kinesthetic sense to new interaction concepts: feasibility and constraints.
98
Feng, K. P., Wan, K., & Luo, N. (2013, July). Natural Gesture Recognition Based on Motion Detection and Skin Color. In Applied Mechanics and Materials (Vol. 321, pp. 974-979).
99
Ghotkar, A. S., & Kharate, G. K. (2013). Vision based Real Time Hand Gesture Recognition Techniques for Human Computer Interaction. International Journal of Computer Applications, 70(16).
100
Arai, K. (2013). Method for 3D Rendering Based on Intersection Image Display Which Allows Representation of Internal Structure of 3D objects. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
101
Pansare, J. R., Dhumal, H., Babar, S., Sonawale, K., & Sarode, A. (2013). Real Time Static Hand Gesture Recognition System in Complex Background that uses Number system of Indian Sign Language. International Journal of Advanced Research in Computer Engineering and Technology (IJARCET), 2(3), 1086-1090.
102
Dhruva, N., Rupanagudi, S. R., & Kashyap, H. N. (2013). Novel Algorithm for Image Processing based Hand Gesture Recognition and its Application in Security. In Advances in Computing, Communication, and Control (pp. 537-547). Springer Berlin Heidelberg.
103
Priyadharshni, V., Jose, M. J., Anand, M. S., Kumaresan, A., & Kumar, N. M. (2013). Hybrid Image Segmentation Using Edge Detection With Fuzzy Thresholding For Hand Gesture Image Recognition. International Journal of Innovative Research and Development, 2(5).
104
Bak, M.-S., Gang, S.-H., & Chae, O.-S. (2013). Robust hand detection and tracking using sensor fusion. Journal of Information Science: Software and Applications, 40(9), 558-566.
105
Patidar, S., & Satsangi, D. C. (2013). Hand Segmentation and Tracking Technique using Color Models. Hand, 1(2).
106
Boughnim, N., Marot, J., Fossati, C., & Bourennane, S. (2013). Hand posture recognition using jointly optical flow and dimensionality reduction. EURASIP Journal on Advances in Signal Processing, 2013(1), 1-22.
107
Arai, K. (2013). Lecturer’s e-Table (Server Terminal) Which Allows Monitoring the Location at Where Each Student is Looking During Lessons with e-Learning Contents Through Client Terminals. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
108
Tumkor, S., Esche, S. K., & Chassapis, C. (2013, November). Hand Gestures in CAD Systems. In ASME 2013 International Mechanical Engineering Congress and Exposition (pp. V012T13A008-V012T13A008). American Society of Mechanical Engineers.
109
Sarkar, A. R., Sanyal, G., & Majumder, S. (2013). Hand gesture recognition systems: a survey. International Journal of Computer Applications, 71(15).
110
AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. (2013). Generating a Domain Specific Inspection Evaluation Method through an Adaptive Framework. International Journal of Advanced Computer Science and Applications (IJACSA), 4(6).
111
Shrawankar, U., & Thakare, V. (2013). An Adaptive Methodology for Ubiquitous ASR System. arXiv preprint arXiv:1303.3948.
112
Germonprez, M., Allen, J. P., Warner, B., Hill, J., & McClements, G. (2013). Open source communities of competitors. interactions, 20(6), 54-59.
113
Nazemi, A., & Murray, I. (2013). An open source reading system for print disabilities.
114
Alroobaea, R. S., Al-Badi, A. H., & Mayhew, P. J. (2013, November). Generating a Domain Specific Checklist through an Adaptive Framework for Evaluating Social Networking Websites. International Journal of Advanced Computer Science and Applications (IJACSA), Special Issue on Extended Papers from Science and Information Conference (p. 25).
115
AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. (2013). The Impact of the Combination between Task Designs and Think-Aloud Approaches on Website Evaluation. Journal of Software and Systems Development, Vol. 2013, Article ID 172572. DOI: 10.5171/2013.172572.
116
AlRoobaea, R., Al-Badi, A. H., & Mayhew, P. J. Research Article The Impact of the Combination between Task Designs and Think-Aloud Approaches on Website Evaluation.
117
Coşkunçay, D. F. (2013). Identifying the Factors Affecting Users’ Adoption of Social Networking. International Journal of Human Computer Interaction (IJHCI), 4(1), 1.
118
Arai, K., & Mardiyanto, R. (2013). Method for psychological status estimation by gaze location monitoring using eye-based human-computer interaction. Editorial Preface, 4(3).
119
Ghani, M. U., Chaudhry, S., Sohail, M., & Geelani, M. N. (2013, December). GazePointer: A real time mouse pointer control implementation based on eye gaze tracking. In Multi Topic Conference (INMIC), 2013 16th International (pp. 154-159). IEEE.
120
Arai, K. (2013). Method for 3D Rendering Based on Intersection Image Display Which Allows Representation of Internal Structure of 3D objects. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
121
Kulkarni, S., & Gala, S. (2013). A Simple Algorithm for Eye Detection and Cursor Control. International Journal of Computer Technology and Applications, 4(6), 880.
122
Mishra, S., & Vaish, A. Key factors leading ROI of e-commerce websites: A user's perspective.
123
Alfarras, M. (2013). Early Detection of Adult Valve Disease-Mitral Stenosis Using The Elman Artificial Neural Network. International Journal of Computer Science and Network Security (IJCSNS), 13(11), 101.
124
AlRoobaea, R. S., Al-Badi, A. H., & Mayhew, P. J. (2013). Generating an Educational Domain Checklist through an Adaptive Framework for Evaluating Educational Systems. International Journal of Advanced Computer Science and Applications (IJACSA), 4(8).
125
Beri, B., & Singh, P. (2013). Web Analytics: Increasing Website’s Usability and Conversion Rate. International Journal of Computer Applications, 72(6), 35-38.
126
Mäesalu, H. (2013). Automated Rule-Based Selection and Instantiation of Layout Templates for Widget-Based Microsites (Doctoral dissertation, MSc thesis, Institute of Computer Science, University of Tartu, Tartu, Estonia).
127
Dunlea, A. L. (2013). Evaluating Usability Evaluations.
128
Mäesalu, H. (2013). Automaatne reeglitel põhinev veebilehe struktuurimallide valimine ja rakendamine [Automated rule-based selection and application of web page layout templates] (Doctoral dissertation, Tartu Ülikool).
129
Coskunçay, D. F. (2013). Identifying the Factors Affecting Users’ Adoption of Social Networking. International Journal of Human Computer Interaction (IJHCI), 4(1), 1.
130
Bozkurt, A., & Bozkaya, M. (2013). Bir Ögrenme Malzemesi Olarak Etkilesimli E-Kitap Hazirlama Adimlari [Steps for preparing an interactive e-book as a learning material]. Egitimde Politika Analizi, 2(2), 8-20.
131
Ivan, L., & Fernandez-Ardevol, M. (2013). Older People and Mobile Communication in Two European Contexts. Revista Româna de Comunicare si Relatii Publice, (3), 83-98.
132
Fernández-Ardèvol, M., & Ivan, L. (2013). Older People and Mobile Communication in Two European Contexts. Romanian Journal of Communication and Public Relations (Revista Română de Comunicare şi Relaţii Publice), 83.
133
Poyil, A. T., & Hema, C. R. Algorithms for Brain Mobile Phone Interfaces: A Survey.
134
Rianto, R. F. (2013). Usability Testing on Flight Searching Website Using Heuristic Evaluation. Information Systems, 2, 4.
135
Arai, K. (2013). Lecturer’s e-Table (Server Terminal) Which Allows Monitoring the Location at Where Each Student is Looking During Lessons with e-Learning Contents Through Client Terminals. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
136
Arai, K., & Mardiyanto, R. (2013). Method for psychological status estimation by gaze location monitoring using eye-based human-computer interaction. Editorial Preface, 4(3).
137
Arai, K. (2013). Lecturer’s e-Table (Server Terminal) Which Allows Monitoring the Location at Where Each Student is Looking During Lessons with e-Learning Contents Through Client Terminals. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
138
Arai, K. (2013). Lecturer’s e-Table (Server Terminal) Which Allows Monitoring the Location at Where Each Student is Looking During Lessons with e-Learning Contents Through Client Terminals. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
139
Arai, K., & Mardiyanto, R. (2013). Method for psychological status estimation by gaze location monitoring using eye-based human-computer interaction. Editorial Preface, 4(3).
140
Ryu, J. S., & Murdock, K. (2013). Consumer acceptance of mobile marketing communications using the QR code. Journal of Direct, Data and Digital Marketing Practice, 15(2), 111-124.
141
Arai, K. (2013). Method for 3D Rendering Based on Intersection Image Display Which Allows Representation of Internal Structure of 3D objects. INTERNATIONAL JOURNAL OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE, 2(6).
142
Ma, Y., Yang, J., Hui, L., & Li, Y. (2012). A novel hand movement tracking system. Journal of Xi'an University of Electronic Science and Technology, 39(1), 79-85.
143
Poyil, A. T., & Hema, C. R. (2012). EEG Feature Extraction in Brain-Mobile Phone Interfaces. Methodology, 2014.
144
Shima, K., Fukuda, O., Tsuji, T., Otsuka, A., & Yoshizumi, M. (2012, June). EMG-based control for a feeding support robot using a probabilistic neural network. In Biomedical Robotics and Biomechatronics (BioRob), 2012 4th IEEE RAS & EMBS International Conference on (pp. 1788-1793). IEEE.
145
Arai, K. (2012). Method for leaning efficiency improvements based on gaze location notifications on e-learning content screen display. International Journal of Advanced Research in Artificial Intelligence, 1(3), 1-6.
146
Li, Z., Sun, W., & Wang, L. (2012, October). A neural network based distributed intrusion detection system on cloud platform. In Cloud Computing and Intelligent Systems (CCIS), 2012 IEEE 2nd International Conference on (Vol. 1, pp. 75-79). IEEE.
147
Arai, K. (2012). Human-computer interaction and its application system based on line-of-sight. Image e-Journal, 41(3), 296-301.
148
Arai, K. (2012). Human-computer interaction and its application system based on line-of-sight. Image e-Journal, 41(3), 296-301.
149
Liu, L. (2012). Master of Science Degree in Engineering (Doctoral dissertation, The University of Toledo).
150
Arai, K. (2012). Method for leaning efficiency improvements based on gaze location notifications on e-learning content screen display. International Journal of Advanced Research in Artificial Intelligence, 1(3), 1-6.
151
Agustian, I., & Risanuri, H. (2012). Mouse Kamera dengan Deteksi Wajah Realtime dan Deteksi Kedip Berbasis Metode Haarcascade dan SURF [Camera mouse with real-time face detection and blink detection based on the Haar cascade and SURF methods].
152
Arai, K., & XiaoYu, G. (2012). Method for 3D Object of Content Representation and Manipulation on 2D Display Using Human Eyes Only. Proceedings of the Korea Contents Association 2012 Fall Conference, 49-50.
153
Arai, K. (2012). Method for leaning efficiency improvements based on gaze location notifications on e-learning content screen display. International Journal of Advanced Research in Artificial Intelligence, 1(3), 1-6.
154
Arai, K. (2012). Human-computer interaction and its application system based on line-of-sight. Image e-Journal, 41(3), 296-301.
155
Arai, K., & Mardiyanto, R. (2012). Helper robot with voice communication capabilities and line-of-sight cruise control function. Image e-Journal, 41(5), 535-542.
156
Arai, K., & Mardiyanto, R. (2012). Helper robot with voice communication capabilities and line-of-sight cruise control function. Image e-Journal, 41(5), 535-542.
157
Sunitha, K. V., & Sharada, A. (2012). Dynamic Construction of Telugu Speech Corpus for Voice Enabled Text Editor. International Journal of Human Computer Interaction (IJHCI), 3(4), 83.
158
Arai, K., & Mardiyanto, R. (2012). Helper robot with voice communication capabilities and line-of-sight cruise control function. Image e-Journal, 41(5), 535-542.
159
Arai, K., & Mardiyanto, R. (2012). Robot arm control and feeding support system based on line-of-sight input. IEEJ Transactions on Electronics, Information and Systems, 132(3), 416-423.
160
Dinh, H., Jovanov, E., & Adhami, R. (2012). Eye blink detection using intensity vertical projection. In International Multi-Conference on Engineering and Technological Innovation: IMETI.
161
Arai, K., & XiaoYu, G. (2012). Method for 3D Object of Content Representation and Manipulation on 2D Display Using Human Eyes Only. Proceedings of the Korea Contents Association 2012 Fall Conference, 49-50.
162
Arai, K. (2012). Human-computer interaction and its application system based on line-of-sight. Image e-Journal, 41(3), 296-301.
163
Arai, K., & Mardiyanto, R. (2012). Robot arm control and feeding support system based on line-of-sight input. IEEJ Transactions on Electronics, Information and Systems, 132(3), 416-423.
164
Lee, H. J., Choi, M. H., & Park, M. K. (2012). The Effects of Self-Efficacy and User's Cognitive Factors on Reuse Intention of SNS. Journal of the Korean Society for information Management, 29(3), 145-167.
165
Arai, K., & Mardiyanto, R. (2012). Helper robot with voice communication capabilities and line-of-sight cruise control function. Image e-Journal, 41(5), 535-542.
166
Arai, K., & Mardiyanto, R. (2012). Robot arm control and feeding support system based on line-of-sight input. IEEJ Transactions on Electronics, Information and Systems, 132(3), 416-423.
167
Arai, K., & Mardiyanto, R. (2012). Robot arm control and feeding support system based on line-of-sight input. IEEJ Transactions on Electronics, Information and Systems, 132(3), 416-423.
168
Arai, K., & XiaoYu, G. (2012). Method for 3D Object of Content Representation and Manipulation on 2D Display Using Human Eyes Only. Proceedings of the Korea Contents Association 2012 Fall Conference, 49-50.
169
Al-Hawamdeh, E. F. (2012). Eye Based Human Computer Interaction using One Button and Single Line Moving Keyboard. Eye, 3(6).
170
Arai, K., & XiaoYu, G. (2012). Method for 3D Object of Content Representation and Manipulation on 2D Display Using Human Eyes Only. Proceedings of the Korea Contents Association 2012 Fall Conference, 49-50.
171
Arai, K. (2012). Method for leaning efficiency improvements based on gaze location notifications on e-learning content screen display. International Journal of Advanced Research in Artificial Intelligence, 1(3), 1-6.
172
Arai, K., & Mardiyanto, R. (2011). Eye-based human-computer interaction allowing phoning, reading e-book/e-comic/e-learning, Internet browsing and TV information extraction. International Journal of Advanced Computer Science and Applications, 2(12).
173
Arai, K., & Mardiyanto, R. (2011). Autonomous control of eye based electric wheel chair with obstacle avoidance and shortest path finding based on Dijkstra algorithm. International Journal of Advanced Computer Science and Applications, 2(12), 19-25.
174
WEI, E. S. (2011). Factors affecting the adoption of touchscreen mobile phones: an empirical study among Generation Y (Doctoral dissertation, UNIVERSITI TUNKU ABDUL RAHMAN).