Computational Intelligence

SoS Newsletter - Advanced Book Block

  • Lavania, S.; Darbari, M.; Ahuja, N.J.; Siddqui, I.A., "Application of computational intelligence in measuring the elasticity between software complexity and deliverability," Advance Computing Conference (IACC), 2014 IEEE International, pp. 1415-1418, 21-22 Feb. 2014. doi: 10.1109/IAdCC.2014.6779533 Abstract: The paper highlights various issues of complexity and deliverability and their impact on software popularity. An expert intelligence system helps identify the dominant and non-dominant impediments of a software product. A fuzzy rule-based system (FRBS) is developed to quantify the trade-off between the complexity and deliverability of a software system.
    Keywords: {computational complexity;expert systems;software quality;FRBS;computational intelligence;dominant impediments;elasticity measurement;expert intelligence system;nondominant impediments;software complexity;software deliverability;software popularity;Conferences;Decision support systems;Handheld computers;Complexity;Deliverability;Expert System}, (ID#:14-2762)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6779533&isnumber=6779283
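The abstract above gives no implementation details; purely as an illustration of what a fuzzy rule-based trade-off measure of this kind might look like, here is a minimal Python sketch. The membership functions, the two rules, the `[0, 1]` input scaling, and the `tri`/`trade_off` names are all invented, not taken from the paper.

```python
# Purely illustrative sketch of a fuzzy rule-based system (FRBS) scoring the
# complexity/deliverability trade-off. Membership functions, rules, and the
# assumption that both inputs are scaled to [0, 1] are invented.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trade_off(complexity, deliverability):
    """Weighted-average defuzzification over two toy rules; 1.0 is most favourable."""
    low_c = tri(complexity, -0.5, 0.0, 0.6)      # "complexity is low"
    high_d = tri(deliverability, 0.4, 1.0, 1.5)  # "deliverability is high"
    r1 = min(low_c, high_d)                      # rule 1: low AND high -> favourable
    r2 = max(1.0 - low_c, 1.0 - high_d)          # rule 2: otherwise -> unfavourable
    return r1 / (r1 + r2) if (r1 + r2) else 0.5

print(round(trade_off(0.2, 0.9), 2))  # 0.67
```

A real FRBS would elicit the membership functions and rule base from domain experts rather than hard-coding them.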
  • Yannakakis, G.N.; Togelius, J., "A Panorama of Artificial and Computational Intelligence in Games," IEEE Transactions on Computational Intelligence and AI in Games, vol. PP, no. 99, pp. 1-1. doi: 10.1109/TCIAIG.2014.2339221 Abstract: This paper attempts to give a high-level overview of the field of artificial and computational intelligence (AI/CI) in games, with particular reference to how the different core research areas within this field inform and interact with each other, both actually and potentially. We identify ten main research areas within this field: NPC behavior learning, search and planning, player modeling, games as AI benchmarks, procedural content generation, computational narrative, believable agents, AI-assisted game design, general game artificial intelligence, and AI in commercial games. We view and analyze the areas from three key perspectives: (1) the dominant AI method(s) used under each area; (2) the relation of each area with respect to the end (human) user; and (3) the placement of each area within a human-computer (player-game) interaction perspective. In addition, for each of these areas we consider how it could inform or interact with each of the other areas; in those cases where we find that meaningful interaction either exists or is possible, we describe the character of that interaction and provide references to published studies, if any. We believe that this paper improves understanding of the current nature of the game AI/CI research field and the interdependencies between its core areas by providing a unifying overview. We also believe that the discussion of potential interactions between research areas provides a pointer to many interesting future research projects and unexplored subfields.
    Keywords: {Artificial intelligence;Computational modeling;Evolutionary computation;Games;Planning;Seminars}, (ID#:14-2763)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6855367&isnumber=4804729
  • Myers, A.J.; Megherbi, D.B., "An efficient computational intelligence technique for affine-transformation-invariant image face detection, tracking, and recognition in a video stream," Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), 2014 IEEE International Conference on, pp. 88-93, 5-7 May 2014. doi: 10.1109/CIVEMSA.2014.6841444 Abstract: While there are many current approaches to detecting, tracking, and recognizing a given face in a video sequence, the difficulties arising from differences in pose, facial expression, orientation, lighting, scaling, and location remain an open research problem. In this paper we present and analyze a computationally efficient approach to each of the three processes: detection, tracking, and recognition of a given template face. The proposed algorithms are faster relative to other existing iterative methods. In particular, unlike such iterative methods, the proposed method does not estimate a given face's rotation angle or scaling factor by searching over all possible rotations or scaling factors. Instead, it segments the face and aligns the line between the two pupils with the image x-axis. Reference face images in a given database are normalized with respect to translation, rotation, and scaling. We show how the proposed estimation of a face image template's rotation and scaling factor leads to real-time rotation and scaling corrections, which makes the recognition algorithm less computationally complex than iterative methods.
    Keywords: {face recognition;image sequences;iterative methods;video signal processing;affine-transformation-invariant image;computational intelligence technique;face detection;face image template;face recognition;face tracking;iterative methods;reference face images;video sequence;video stream;Databases;Face;Face recognition;Histograms;Lighting;Nose;Streaming media;computational intelligence;detection;facial;machine learning;real-time;recognition;tracking;video}, (ID#:14-2764)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6841444&isnumber=6841424
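The eye-alignment idea in the abstract above can be sketched in a few lines: given the two pupil centres, the in-plane rotation and the scale factor fall out of a single `atan2` and a distance ratio, with no search over candidate rotations. The coordinates, the `rotation_and_scale` helper, and the reference inter-pupil distance of 60 pixels are illustrative assumptions, not values from the paper.

```python
import math

# Sketch of the normalization step: estimate a face's in-plane rotation and
# scale directly from the two pupil centres instead of searching over all
# candidate rotations/scales. Coordinates and ref_distance are made up.

def rotation_and_scale(left_pupil, right_pupil, ref_distance=60.0):
    dx = right_pupil[0] - left_pupil[0]
    dy = right_pupil[1] - left_pupil[1]
    angle = math.degrees(math.atan2(dy, dx))   # rotation that levels the pupils
    scale = ref_distance / math.hypot(dx, dy)  # factor normalizing face size
    return angle, scale

angle, scale = rotation_and_scale((100, 120), (160, 120))
print(angle, scale)  # 0.0 1.0 (pupils already level and at reference distance)
```

Rotating by `-angle` and resizing by `scale` would then bring a detected face into the same canonical frame as the normalized reference images.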
  • Antoniades, A.; Took, C.C., "A Google approach for computational intelligence in big data," Neural Networks (IJCNN), 2014 International Joint Conference on, pp. 1050-1054, 6-11 July 2014. doi: 10.1109/IJCNN.2014.6889469 Abstract: With the advent of the emerging field of big data, it is becoming increasingly important to equip machine learning algorithms to cope with the volume, variety, and velocity of data. In this work, we employ the MapReduce paradigm to address these issues as an enabling technology for the well-known support vector machine to perform distributed classification of skin segmentation data. An open source implementation of MapReduce called Hadoop offers a streaming facility, which allows us to focus on the computational intelligence problem at hand instead of on the implementation of the learning algorithm. This is the first time that the support vector machine has been proposed to operate in a distributed fashion as it is, circumventing the need for long and tedious mathematical derivations. This highlights the main advantages of MapReduce: its generality and distributed computation for machine learning with minimum effort. Simulation results demonstrate the efficacy of MapReduce when distributed classification is performed even when only two machines are involved, and we highlight some of the intricacies of MapReduce in the context of big data.
    Keywords: {Big Data;distributed processing;learning (artificial intelligence);pattern classification;public domain software;support vector machines;Google approach;MapReduce;big data;computational intelligence;distributed classification;machine learning algorithms;open source Hadoop;skin segmentation;streaming facility;support vector machine;Big data;Context;Machine learning algorithms;Skin;Support vector machines;Testing;Training}, (ID#:14-2765)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6889469&isnumber=6889358
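The Hadoop Streaming setup the abstract describes can be approximated as follows: each mapper applies an already-trained linear SVM to its shard of the input, so the classification step distributes over machines with no change to the model. The weights, bias, and three-feature record layout below are invented placeholders, not the paper's model; in a real Streaming job the lines would arrive on stdin.

```python
# Hypothetical mapper in the spirit of a Hadoop Streaming job: a pre-trained
# linear SVM is applied independently to each record, so classification
# distributes across machines without touching the learning algorithm.
# The weights, bias, and feature layout are invented placeholders.

W = [0.01, -0.02, 0.03]   # hypothetical learned SVM weights
B = -0.5                  # hypothetical bias term

def classify(features):
    """Sign of the linear decision function: 1 = skin, 0 = non-skin."""
    score = sum(w * x for w, x in zip(W, features)) + B
    return 1 if score >= 0 else 0

def mapper(lines):
    """In a real Streaming job, `lines` would be sys.stdin."""
    for line in lines:
        yield classify([float(v) for v in line.split()])

print(list(mapper(["10 20 30", "1 1 1"])))  # [1, 0]
```

Because each record is scored independently, splitting the input across any number of mappers changes nothing about the predictions, which is what makes the "use the SVM as it is" approach work.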
  • Sharif, N.; Zafar, K.; Zyad, W., "Optimization of requirement prioritization using Computational Intelligence technique," Robotics and Emerging Allied Technologies in Engineering (iCREATE), 2014 International Conference on, pp. 228-234, 22-24 April 2014. doi: 10.1109/iCREATE.2014.6828370 Abstract: Requirement engineering (RE) is an important part of the software development life cycle and a traditional software engineering (SE) process. The goal of RE is to identify, analyze, document, and validate requirements. Requirement prioritization is a crucial step toward making good decisions about the product plan, but it is often neglected; without proper prioritization, a product may fail to meet its core objectives. When a project has a tight schedule, restricted resources, and high customer expectations, it is necessary to deploy the most critical and important features as early as possible, and requirements are prioritized for this purpose. Several requirement prioritization techniques have been presented by researchers over the past years in the domains of both SE and computational intelligence. This paper presents a new technique, FuzzyHCV, that is a hybrid of both domains: a combination of hierarchical cumulative voting (HCV) and a fuzzy expert system. Comparative analysis between the new technique and an existing HCV technique shows that the proposed technique is more reliable and accurate.
    Keywords: {expert systems;fuzzy set theory;software engineering;statistical analysis;FuzzyHCV technique;RE;SE process;computational intelligence technique;fuzzy expert system;hierarchical cumulative voting;requirement engineering;requirement prioritization techniques;software development life cycle;software engineering process;Computers;Documentation;Expert systems;Fuzzy systems;Software;Software engineering;Fuzzy HCV;Fuzzy systems;HCV;Requirement prioritization}, (ID#:14-2766)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6828370&isnumber=6828323
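For context on the baseline being hybridized, here is a minimal sketch of plain hierarchical cumulative voting (HCV): stakeholders spread points over requirement groups and then over the requirements inside each group, and a requirement's final weight is the product of the two normalized shares. The group names and point values are made up, and the fuzzy expert system layer that FuzzyHCV adds is omitted entirely.

```python
# Minimal sketch of plain hierarchical cumulative voting (HCV), the baseline
# technique the paper hybridizes with a fuzzy expert system. Group names and
# point allocations are invented examples.

def hcv(group_points, within_points):
    """Weight of each requirement = (group share) * (share within its group)."""
    total = sum(group_points.values())
    weights = {}
    for group, reqs in within_points.items():
        g_share = group_points[group] / total
        sub_total = sum(reqs.values())
        for req, pts in reqs.items():
            weights[req] = g_share * pts / sub_total
    return weights

w = hcv({"security": 60, "usability": 40},
        {"security": {"auth": 75, "audit": 25},
         "usability": {"themes": 100}})
print({k: round(v, 2) for k, v in w.items()})  # {'auth': 0.45, 'audit': 0.15, 'themes': 0.4}
```

The resulting weights sum to 1, which makes rankings from different stakeholders directly comparable before any fuzzy aggregation is applied.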
  • Alvares, Marcos; Marwala, Tshilidzi; de Lima Neto, Fernando Buarque, "Application of Computational Intelligence For Source Code Classification," Evolutionary Computation (CEC), 2014 IEEE Congress on, pp. 895-902, 6-11 July 2014. doi: 10.1109/CEC.2014.6900300 Abstract: Multi-language source code management systems have been widely used to collaboratively manage software development projects. These systems are a fundamental step toward fully exploiting improved communication, adding concrete value to the way people collaborate to produce more reliable computational systems. They evaluate the results of analyses in order to organize and optimize source code. These analyses are strongly dependent on technologies (i.e., frameworks, programming languages, libraries), each with its own characteristics and syntactic structure. To overcome this limitation, source code classification is an essential preprocessing step for identifying which analyses should be evaluated. This paper introduces a new approach to generating content-based classifiers by using evolutionary algorithms. Experiments were performed on real-world source code collected from more than 200 different open source projects. Results show that our approach can be successfully used to create more accurate source code classifiers. The resulting classifier is also extensible and flexible to new classification scenarios, opening perspectives for new technologies.
    Keywords: {Algorithm design and analysis;Computer languages;Databases;Genetic algorithms;Libraries;Sociology;Statistics}, (ID#:14-2767)
    URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900300&isnumber=6900223
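A toy version of the evolutionary-classifier idea: a (1+1)-style evolutionary loop mutates a set of keywords that vote for one class, keeping a child whenever it matches the training labels at least as well as its parent. The corpus, keyword pool, and fitness function are all invented here and are far simpler than the paper's content-based classifiers.

```python
import random

# Loose sketch of evolving a content-based source code classifier. The
# "classifier" is a keyword set voting for one language; the corpus, the
# keyword pool, and the fitness function are invented for illustration.

CORPUS = [
    ("def f(x):\n    return x", "python"),
    ("import os\nprint(os.sep)", "python"),
    ("public class A { int x; }", "java"),
    ("import java.util.List;\npublic class B {}", "java"),
]
POOL = ["def", "return", "public", "class", "{", "import", "print"]

def classify(keywords, code):
    # any keyword hit votes for "java"; otherwise fall back to "python"
    return "java" if any(k in code for k in keywords) else "python"

def fitness(keywords):
    """Number of corpus files labeled correctly (0..4 here)."""
    return sum(classify(keywords, code) == label for code, label in CORPUS)

def evolve(generations=300, seed=0):
    rng = random.Random(seed)
    best = {rng.choice(POOL)}
    for _ in range(generations):
        child = set(best)
        child.symmetric_difference_update({rng.choice(POOL)})  # toggle one keyword
        if child and fitness(child) >= fitness(best):
            best = child
    return best

best = evolve()
print(sorted(best), fitness(best))  # e.g. {"public"} alone scores 4/4 here
```

Accepting equal-fitness children lets the search drift across plateaus, which on this toy corpus is enough to discover a discriminative keyword such as "public" or "class".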

Note:

Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests for removal of links or modifications to specific citations via email to SoS.Project (at) SecureDataBank.net, and include the ID# of the specific citation in your correspondence.