# I. Introduction

Access to education is a fundamental right of every child, and improving that access is an obligation of the government. The emergence of IT and its use in the education sector has helped students at all levels to improve their ability to learn, not by memorizing text but by understanding concepts and theories. IT has thus made teaching and learning not only more interesting but also more effective in recent years. IT Enabled Services (ITES) have also played a vital role in higher education institutions, and now that a baseline of ITES has been established at most institutions, it is becoming important to evaluate the quality of ITES at different institutions. In this paper, we focus on two higher education institutions, one from the public sector and one from the private sector. Universities in Saudi Arabia were chosen because the study aims to compare ITES across Saudi universities. Given the nature of the study, two leading universities, one government and one private, were selected because both are located in the same city. The public sector university (referred to as A in the rest of this paper) was established in the fifties and is one of the oldest universities in the Kingdom, while the private sector university (referred to as B in the rest of this paper) was established in the nineties. Current student enrolment at the private university is around 3,500, while enrolment at the public university is about ten times larger, with a similar ratio in staff numbers. The purpose of this study is to compare the state of ITES in Saudi universities.

# II. Literature Review

To compare the state of the art, it is important to establish the parameters on which the quality of ITES can be compared across universities.
Some recent work carried out in this domain is presented in this section. Several researchers, including Alanezi and Yang [1][5], have noted that 'accessibility' is a vital factor for measuring the quality of ITES. Tan and Burgess [1][4][5] advocate customization as a major factor in quantifying ITES quality for higher education, while Parasuraman and George [1][2][6][7] hold that delivery of teaching and the efficiency of the ITES are also important. Alanezi, Lin, Sedera and Swaid [1][8][9][10] have identified and advocated factors such as functionality and information quality; both form the core of ITES and are valuable in their nature and existence. Zeithaml [2] found that response time, service usability, system integrity and trust are important in quantifying quality: these factors govern the environment and responsiveness of the system, and measure the quality of the system rather than its functionality. Tan, George and Burgess [1][4][7] advocate security as an integral factor in measuring ITES quality. In addition, researchers such as Burgess [5] consider site design, service usability and service reliability to be of great value in measuring ITES quality. Aziz [11] shortlisted seventeen items to evaluate the quality of ITES in higher education. The shortlisting was done from more than 100 elements based on recurrence, relevance and importance, as determined by expert opinion. Each factor, its description and its citations are given in Table 1 (items 12-17 are shown below; items 1-11 appear with Table 1 later in the paper).

| No | Factor | Description | Citation |
|----|--------|-------------|----------|
| 12 | Service usability | The degree to which the users find it easy to use the various ITES. | [2] |
| 13 | Site design | The quality of the site design in terms of user satisfaction and ease of use. | [5] |
| 14 | System integrity | The provision of consistent information at all times. | [2] |
| 15 | Trust | How reliable, efficient and responsive a system is. | [2] |
| 16 | Usefulness | The degree to which the users find it easier to do their work via the ITES. | [5] |
| 17 | User support | The degree to which the ITES department personnel are willing to serve the users when help and support are required. | [3] |

The findings of Aziz [11] form the basis of this study. They are contemporary and discuss an evolving paradigm of the emerging state of the art by authors of repute [12,13]. Moreover, a sound and current methodology was used to affirm the findings, which increases our confidence in using that publication as the base of this research.

# III. Methodology

This study is mixed method research [14,15], completed by triangulating the qualitative and quantitative results. A survey was conducted on 300 individuals at each institute and the results were collected. The purpose of the survey was to ask users about the quality of IT enabled services at their respective institute against the factors obtained from the comprehensive literature review. A Likert scale [16] was used to rank the responses on a scale of 1-5, i.e. from poor to excellent; the first column in each response list has weight 1, the second column weight 2, the third column weight 3, and so on. Once the weighted sums are accumulated, they are divided by the total number of respondents to obtain the weighted average, and this is done for both institutes separately. The results are then compared factor by factor to identify the areas in which each institute performs better.
A qualitative study was conducted along the same lines: four respondents were interviewed (two from each university) and asked to assess the standard of the IT enabled services at their respective institutes against the same factors, considering the contemporary situation [17][18][19][20][21][22][23][24]. In this research we follow the partially mixed sequential dominant status paradigm, in which the qualitative findings follow the quantitative findings and are dominant. This paradigm is followed in research studies centred on evaluating technology education [25][26][27][28][29][30][31].

# IV. Quantitative Study

Considering the scale of the survey, it is important to maximize the number of responses; at the same time, the responses must be precise and should come from experienced users [17,32]. To achieve this, the means given in Table 2 were used to distribute the survey and collect the responses. The effectiveness of these means is given in Table 3, while Figure 1 illustrates the spread of the survey call. The confidence level expresses how confident we are that a response is correct and precise; a confidence level of 95% is usually used in research, although 99% is also used. The confidence interval determines the range of acceptable results and is always presented with the ± symbol: if the threshold value is 67 and the confidence interval is 5, values from 62-72 are considered legitimate. Since the survey was conducted at two different institutes to compare the state of the art of IT enabled services, almost half of the responses came from each institute. A 5-level Likert scale ranging from poor to excellent was used, mapped to the values 1-5: the value for poor is 1 and the value for excellent is 5.
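As an illustration of how the figures reported in Table 4 fit together, the conventional sample-size calculation (Cochran's formula with a finite-population correction) can be applied to the reported values: 99% confidence level, a confidence interval of ±3, p = 50%, and an accessed population of 776. This is a sketch of the standard calculation, not the authors' own procedure; the function name is ours.

```python
import math

def required_sample_size(z: float, margin: float, p: float, population: int) -> int:
    """Cochran's sample-size formula with a finite-population correction."""
    # Infinite-population sample size: n0 = z^2 * p * (1 - p) / e^2
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    # Correct for the finite accessible population N
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Values from Table 4: 99% confidence (z ~ 2.576), interval of +/-3
# percentage points, p = 50%, population accessed = 776.
print(required_sample_size(z=2.576, margin=0.03, p=0.5, population=776))  # 547
```

The result, 547, is close to the sample size of 548 reported in Table 4; the small difference is attributable to rounding of the z-value.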
Every response that selects 'poor' against an item is multiplied by 1, 'somewhat acceptable' by 2, 'acceptable' by 3, 'very good' by 4, and 'excellent' by 5. The average weighted response is obtained by dividing the weighted sum by the total number of respondents. It is further worth noting that some questions were not answered by some individuals. For institute A, 261 respondents responded, while some 325 respondents responded for institute B.

# V. Qualitative Study and Triangulation

Table 5 and Table 6 summarize the survey response statistics for institutes A and B respectively. The results shown in Table 7 clearly demonstrate that the quality of ITES is better at institute A than at institute B on all factors. Considering these results, a qualitative study was formulated in which four interviews were conducted to gain insight into the ITES of the respective institutes; the outcome is given in Table 8. Along with their descriptive accounts of the ITES quality items, the interviewees preferred to give absolute numbers when assessing quality. Four interviews were conducted in total: two at institute A and two at institute B. The summary of the results in Table 8 clearly demonstrates that the interviewees, like the survey respondents, believed that the quality of ITES is better at institute A than at institute B. In the survey institute A led on all quality factors, while in the interviews institute A leads on 12 of the 17 factors, is equal on 4, and lags on 1. Figures 2 and 3 depict the quantitative and qualitative analyses respectively.
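The weighting scheme above can be sketched in a few lines. The function name and the sample counts are ours; the counts are chosen to be consistent with the Accessibility row for institute A (261 respondents, average weighted response 3.60).

```python
def average_weighted_response(counts):
    """Average weighted response for one 5-point Likert item.

    `counts` lists how many respondents chose each option, ordered
    from poor (weight 1) to excellent (weight 5).
    """
    weights = [1, 2, 3, 4, 5]
    weighted_sum = sum(w * c for w, c in zip(weights, counts))
    return weighted_sum / sum(counts)

# 261 respondents: 0 poor, 21 somewhat acceptable, 90 acceptable,
# 123 very good, 27 excellent.
print(round(average_weighted_response([0, 21, 90, 123, 27]), 2))  # 3.6
```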
In the triangulation process, we examine whether the findings of the qualitative and the quantitative methods converge to similar results. The triangulation process is shown in Table 9.

# VI. Discussion

Seventeen factors were used for measuring the quality of ITES at institutions in Saudi Arabia. Two institutions, one government and one private university, were selected for this purpose in the capital city of Riyadh. The results of the study demonstrate that the quality of ITES is better at institute A than at institute B. After completion of the triangulation process the results did not change much from the initial picture, since the quantitative and qualitative findings were largely consistent. For the factors 'accessibility', 'delivery of teaching', 'efficiency', 'information quality', 'interoperability', 'privacy', 'security', 'service reliability', 'service usability', 'site design', 'system integrity', and 'user support', the qualitative and quantitative findings were the same. For the factors 'customization', 'functionality', 'trust', and 'usefulness', the qualitative findings differ from the quantitative findings: the survey established that institute A is better than institute B, but the interviews established that both institutes have the same standing. As stated in the methodology section, the qualitative findings dominate the quantitative findings, so the qualitative results are adopted in case of a disagreement between the two. Since the qualitative findings show that the state of the art of the two institutions is not different for these four factors, the qualitative findings hold.
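The resolution rule applied above, where the qualitative verdict dominates whenever the two methods disagree, can be sketched as follows. The verdicts are taken from the discussion; the function and variable names are ours.

```python
# Per-factor verdicts: "A" (institute A better), "B" (institute B better),
# or "equal". Only the factors where the methods disagreed are listed.
quantitative = {"customization": "A", "functionality": "A", "trust": "A",
                "usefulness": "A", "response time": "A"}
qualitative = {"customization": "equal", "functionality": "equal", "trust": "equal",
               "usefulness": "equal", "response time": "B"}

def triangulate(quant, qual):
    final = {}
    for factor in quant:
        if quant[factor] == qual[factor]:
            final[factor] = quant[factor]   # methods agree
        else:
            final[factor] = qual[factor]    # qualitative dominates on disagreement
    return final

result = triangulate(quantitative, qualitative)
print(result["customization"])  # equal
print(result["response time"])  # B
```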
For one factor, 'response time', the quantitative findings showed institute A to be better while the qualitative findings showed the opposite; for the reasons given above, the qualitative result is retained.

# VII. Conclusion

In summary, 17 factors were identified in order to compare the state of the art of ITES in Saudi universities. Two institutions were compared on the basis of quantitative and qualitative data, and the results show that institute A leads with a better score on 12 factors, the scores are equal on four factors, and institute B leads on only one factor. It can be concluded that the state of the art of ITES is much better at institute A than at institute B. Institute B needs to be more concerned with improving the quality of its ITES, especially in the areas of accessibility, information security, privacy, and user support, while institute A needs to improve in customization, usefulness, response time, and trust.

# VIII. Acknowledgement

The authors thank Prince Sultan University in the Kingdom of Saudi Arabia for funding this project in the year 2013-2014 under number IBRP-CFW-2013-11-14.

![Quality of IT Enabled Services in Higher Education Institutions in Saudi Arabia (Global Journal of Computer Science and Technology, Volume XVI, Issue IV, Year 2016)](image-2.png)

Table 1: Factors for measuring ITES quality (items 1-11; items 12-17 are listed in Section II)

| No | Factor | Description | Citation |
|----|--------|-------------|----------|
| 1 | Accessibility | The degree to which the user can access the required service. | [1][2][3][4][5] |
| 2 | Customization | The ability to configure the ITES according to requirement. | [5][1][4] |
| 3 | Delivery of teaching | The ways and quality of the teaching. | [7] |
| 4 | Efficiency | How quickly the required services are available. | [1][6][2][7] |
| 5 | Functionality | What specific tasks can be performed by using the system. | [1][8] |
| 6 | Information quality | The level to which the available information suits the user. | [9][10] |
| 7 | Interoperability | Access to multiple services. | [4] |
| 8 | Privacy | The level to which a person is secure in performing his tasks without being public. | [7][6][1] |
| 9 | Response time | The time between the request and the availability of the information. | [2] |
| 10 | Security | The adequacy of the security features implemented in the ITES. | [7][1][4] |
| 11 | Service reliability | The percentage of time the ITES is available for use without failure. | [5] |

Table 2: Means of spreading the survey

| No. | Mean | Count | Responses | % |
|-----|------|-------|-----------|---|
| 1 | Paper | 10 | 10 | 100 |
| 2 | Web Link | 500 | 398 | 80 |
| 3 | Skype Text | 20 | 10 | 50 |
| 4 | Google Talk | 50 | 30 | 60 |
| 5 | Phone call | 60 | 40 | 67 |
| 6 | Text message | 40 | 36 | 90 |
| 7 | Facebook | 96 | 64 | 67 |
| | Total | 776 | 588 | 75 |

Table 3: Effectiveness of the means of sending the survey

| No. | Mean of Sending Survey | Count | Responses | Average Response |
|-----|------------------------|-------|-----------|------------------|
| 1 | Paper Survey | 10 | 10 | 1.7% |
| 2 | Web Link | 500 | 398 | 67.6% |
| 3 | Skype Text Request | 20 | 10 | 1.7% |
| 4 | Google Talk Link Forwarding | 50 | 30 | 5.1% |
| 5 | Phone call Requests | 60 | 40 | 6.8% |
| 6 | Text message Requests | 40 | 36 | 6.12% |
| 7 | Facebook messaging | 96 | 64 | 10.8% |

Table 4: Survey confidence measures

| Measure | Number |
|---------|--------|
| Confidence Level | 99% |
| Confidence Interval | 3 |
| Population accessed | 776 |
| Sample Size | 548 |
| Percentage | 50* |

*The actual population size is unknown [9].

Table 5: Survey response statistics for institute A (weighted responses)

| Items | Poor | Somewhat Acceptable | Acceptable | Very Good | Excellent | Average Weighted Response |
|-------|------|---------------------|------------|-----------|-----------|---------------------------|
| Accessibility | 0 | 42 | 270 | 492 | 135 | 3.60 |
| Customization | 0 | 48 | 972 | 1584 | 315 | 3.41 |
| Delivery of teaching | 3 | 24 | 252 | 504 | 90 | 3.59 |
| Efficiency | 0 | 36 | 216 | 516 | 180 | 3.72 |
| Functionality | 0 | 30 | 278 | 384 | 255 | 3.67 |
| Information quality | 3 | 36 | 234 | 504 | 165 | 3.65 |
| Interoperability | 3 | 42 | 331 | 384 | 90 | 3.78 |
| Privacy | 3 | 36 | 341 | 492 | 105 | 3.88 |
| Response time | 3 | 24 | 261 | 456 | 165 | 3.65 |
| Security | 0 | 42 | 243 | 420 | 90 | 3.53 |
| Service reliability | 0 | 24 | 234 | 336 | 165 | 3.67 |
| Service usability | 0 | 24 | 297 | 348 | 225 | 3.68 |
| Site design | 0 | 6 | 234 | 552 | 120 | 3.75 |
| System integrity | 0 | 36 | 279 | 456 | 105 | 3.56 |
| Trust | 0 | 42 | 405 | 348 | 75 | 3.37 |
| Usefulness | 0 | 30 | 297 | 372 | 210 | 3.65 |
| User support | 0 | 36 | 252 | 348 | 225 | 3.68 |

Table 6: Survey response statistics for institute B (weighted responses)

| Items | Poor | Somewhat Acceptable | Acceptable | Very Good | Excellent | Average Weighted Response |
|-------|------|---------------------|------------|-----------|-----------|---------------------------|
| Accessibility | 12 | 102 | 531 | 312 | 75 | 3.17 |
| Customization | 0 | 126 | 540 | 228 | 75 | 2.94 |
| Delivery of teaching | 42 | 120 | 531 | 156 | 15 | 2.69 |
| Efficiency | 9 | 132 | 495 | 216 | 120 | 3.06 |
| Functionality | 6 | 138 | 504 | 288 | 75 | 3.06 |
| Information quality | 21 | 72 | 414 | 384 | 150 | 3.24 |
| Interoperability | 30 | 96 | 432 | 264 | 120 | 3.02 |
| Privacy | 24 | 138 | 177 | 120 | 75 | 1.70 |
| Response time | 9 | 108 | 477 | 288 | 135 | 3.17 |
| Security | 18 | 132 | 468 | 228 | 120 | 3.01 |
| Service reliability | 18 | 138 | 468 | 204 | 75 | 2.92 |
| Service usability | 24 | 108 | 441 | 288 | 60 | 2.98 |
| Site design | 21 | 144 | 477 | 192 | 90 | 2.91 |
| System integrity | 18 | 180 | 450 | 168 | 60 | 2.81 |
| Trust | 6 | 150 | 432 | 264 | 135 | 3.10 |
| Usefulness | 18 | 144 | 495 | 204 | 75 | 2.92 |
| User support | 12 | 144 | 468 | 228 | 90 | 2.99 |
# References

[1] M. A. Alanezi, A. Kamil, S. Basri, "A proposed instrument dimensions for measuring e-government service quality," International Journal of u- and e-Service, vol. 3, 2010.
[2] V. A. Zeithaml, A. Parasuraman, A. Malhotra, "Conceptual framework for understanding e-service quality: Implications for future research and managerial practice," 2000.
[3] Z. Yang, S. Cai, Z. Zhou, N. Zhou, "Development and validation of an instrument to measure user perceived service quality of information presenting web portals," Information & Management, vol. 42, 2005.
[4] C.-W. Tan, I. Benbasat, R. T. Cenfetelli, "IT-mediated customer service content and delivery in electronic governments: An empirical investigation of the antecedents of service quality," MIS Quarterly, vol. 37, 2013.
[5] L. Burgess, "A conceptual framework for understanding and measuring perceived service quality in net-based customer support systems," CollECTeR LatAm Conference, Santiago, Chile, 2004.
[6] A. Parasuraman, V. A. Zeithaml, A. Malhotra, "E-S-QUAL: A multiple-item scale for assessing electronic service quality," Journal of Service Research, vol. 7, 2005.
[7] A. George, G. G. Kumar, "Impact of service quality dimensions in internet banking on customer satisfaction," DECISION, vol. 41, 2014.
[8] H.-F. Lin, "Determining the relative importance of mobile banking quality factors," Computer Standards & Interfaces, vol. 35, 2013.
[9] D. Sedera, G. Gable, "A factor and structural equation analysis of the enterprise systems success measurement model," ICIS 2004 Proceedings, 36, 2004.
[10] S. I. Swaid, R. T. Wigand, "Measuring the quality of e-service: Scale development and initial validation," Journal of Electronic Commerce Research, vol. 10, 2009.
[11] B. Shahzad, R. Aziz, "Factors for Measurement of ITES Quality for Higher Education Institutions in Saudi Arabia," Global Journal of Computer Science and Technology, vol. 15, 2015.
[12] B. Shahzad, E. Alwagait, "Utilizing Technology in Education Environment: A Case Study," Tenth International Conference on Information Technology: New Generations (ITNG), 2013.
[13] S. Livingstone, "Taking risky opportunities in youthful content creation: Teenagers' use of social networking sites for intimacy, privacy and self-expression," New Media & Society, vol. 10, 2008.
[14] D. A. Abowitz, T. M. Toole, "Mixed method research: Fundamental issues of design, validity, and reliability in construction research," Journal of Construction Engineering and Management, vol. 136, 2009.
[15] J. C. Greene, V. J. Caracelli, W. F. Graham, "Toward a conceptual framework for mixed-method evaluation designs," Educational Evaluation and Policy Analysis, vol. 11, 1989.
[16] J. W. Lee, P. S. Jones, Y. Mineyama, X. E. Zhang, "Cultural differences in responses to a Likert scale," Research in Nursing & Health, vol. 25, 2002.
[17] J. W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, Sage Publications, 2009.
[18] J. Ritchie, J. Lewis, C. M. Nicholls, R. Ormston, Qualitative Research Practice: A Guide for Social Science Students and Researchers, Sage, 2013.
[19] G., "Pragmatism vs interpretivism in qualitative information systems research," European Journal of Information Systems, vol. 21, 2012.
[20] S. Q. Qu, J. Dumay, "The qualitative research interview," Qualitative Research in Accounting & Management, vol. 8, 2011.
[21] L. I. Meho, "E-mail interviewing in qualitative research: A methodological discussion," Journal of the American Society for Information Science and Technology, vol. 57, 2006.
[22] C. Weston, T. Gandell, J. Beauchamp, L. McAlpine, C. Wiseman, C. Beauchamp, "Analyzing interview data: The development and evolution of a coding system," Qualitative Sociology, vol. 24, 2001.
[23] C. E. Hill, K. D. Loch, D. Straub, K. El-Sheshai, "A qualitative assessment of Arab culture and information technology transfer," Journal of Global Information Management (JGIM), vol. 6, 1998.
[24] P. Burnard, "A method of analysing interview transcripts in qualitative research," Nurse Education Today, vol. 11, 1991.
[25] Y. Al-Ohali, A. A. Al-Oraij, B. Shahzad, "KSU News Portal: A Case Study," International Conference on Internet Computing (ICOMP'11), 2011.
[26] B. Shahzad, A. Alwagait, "Does a Change in Weekend Days Have an Impact on Social Networking Activity?," Journal of Universal Computer Science, vol. 20, 2015.
[27] E. Alwagait, B. Shahzad, S. Alim, "Impact of social media usage on students academic performance in Saudi Arabia," Computers in Human Behavior, vol. 51, 2015.
[28] B. Shahzad, E. Alwagait, S. Alim, "Impact of Change in Weekend Days on Social Networking Culture in Saudi Arabia," 2014 International Conference on Future Internet of Things and Cloud, 2014.
[29] B. Shahzad, "Software Risk Management: Prioritization of frequently occurring Risk in Software Development Phases Using Relative Impact Risk Model," 2nd International Conference on Information and Communication Technology (ICICT 2007), 2007.
[30] B. Shahzad, J. Iqbal, Z. Haq, S. Raza, "Distributed risk analysis using relative impact technique," 3rd Asian Conference on Intelligent Systems and Networks, 2006.
[31] B. Shahzad, T. Afzal, R. Irfan, "Enhanced risk analysis: Relative impact factorization," First International Conference on Information and Communication Technologies (ICICT 2005), 2005.
[32] A. M. S., B. Shahzad, Application of Quantitative Research Methods in Identifying Software Project