
Monday, 6 October 2014

How to configure Windows Server 2008 for DSE





   This article is about configuring Windows Server 2008 with Kerberos authentication. Kerberos is an integral part of Windows Server 2008 Active Directory implementations, and anyone planning to deploy and maintain an enterprise NoSQL database such as DataStax Enterprise should have a basic knowledge of the principles and administrative issues involved in this front-line security technology.
   To configure Windows Server 2008 with Kerberos authentication, we need to install the Web Server (IIS), DHCP Server, and Active Directory Domain Services roles. We should also set a static IP address and computer name, and configure all installed services. The following section explains how to properly configure the system for Kerberos authentication and how to install the necessary software.

Wednesday, 17 September 2014

How to configure Linux for DataStax Enterprise


   This article is about configuring Ubuntu (Linux) for DataStax Enterprise. DataStax Enterprise is a big data platform built on Apache Cassandra that manages real-time, analytics, and enterprise search data. DataStax Enterprise leverages Cassandra, Apache Hadoop, and Apache Solr to shift focus from the data infrastructure to using data strategically.
   To configure Ubuntu (Linux) for DataStax Enterprise, we need to install an OpenSSH server, Java, JCE, and curl. We should also set a static IP address and hostname, making it easier to identify resources on the network. The following section explains how to properly configure the system for name resolution using DNS and static hostname records, and how to install the necessary software.

Monday, 4 August 2014

Use Cognitum ASP.NET Providers for Cassandra (Role, Membership, Session, Profile) in your application.


1.      About

My name is Karol and I would like to show you how to implement ASP.NET providers (Role, Membership, Session, Profile) using Cassandra as a database. You can find all the necessary providers in Cognitum ASP.NET Providers for Cassandra. What is important, you can easily use the Cassandra providers in your application to store the providers' data in Cassandra.
The solution implements Membership, Role, Profile and Session-State Store providers. ASP.NET Membership enables you to manage and validate user information for your Web application. The Role provider maps users to roles and provides methods to manage roles. The Profile provider extends the information stored about a user with new properties. The Session-State provider serializes and deserializes session-state data and stores it. You can find more information about these providers on MSDN.
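
Once the Cassandra-backed providers are registered in web.config, they are used through the standard System.Web.Security API. Below is a minimal sketch of that usage; the user name, password, and role are illustrative values only, not part of the package:

using System;
using System.Web.Security;

public static class ProviderDemo
{
    public static void Run()
    {
        // Membership: create and validate a user via the configured provider
        MembershipCreateStatus status;
        Membership.CreateUser("jdoe", "P@ssw0rd!", "jdoe@example.com",
                              "Question?", "Answer", true, out status);
        bool valid = Membership.ValidateUser("jdoe", "P@ssw0rd!");

        // Roles: map the user to a role and check the mapping
        if (!Roles.RoleExists("customers"))
            Roles.CreateRole("customers");
        Roles.AddUserToRole("jdoe", "customers");

        Console.WriteLine("Valid: {0}, in role: {1}",
                          valid, Roles.IsUserInRole("jdoe", "customers"));
    }
}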

2.      See solution

Requirements: 

2.1.  Run application

You can run my application using the MVC application package or the source code.

    MVC application package

Requirements: 

Tuesday, 13 May 2014

Simple form design (and more…) using Fluent Editor.

At Cognitum we are developing innovative business applications that make use of semantic technologies and can dramatically improve company productivity. Today we would like to present a method, using our technologies, with which it is possible to build a complex C#/Silverlight business application by combining Fluent Editor and Ontorion. We will start with a simple example to better explain the potential of our approach. Then we will present a more realistic example showing how this approach can be applied to solve common business problems.




Let’s imagine we want to make a Customer form containing a customer section with compulsory fields First-Name and Surname. First of all, we code the characteristics of the form using our Controlled Natural Language editor, Fluent Editor, obtaining the code:

Every form is a thing.
Every customer-form is a form.

Every customer-form must concern a customer.

Every customer must have-first-name (some-string-value).
Every customer must have-surname (some-string-value).

Our C# application will then render the form as:


As we can see, the name of the form is taken from (name)-form, the name of the section comes from the element the form concerns, and the name of each field is taken from have-(name). Moreover, as the fields have been declared in the CNL file with the must keyword, submitting the form without filling in these fields will result in an error.
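
As an illustration only (this is not the actual Fluent Editor API), a field label could be derived from a CNL role name following the have-(name) convention described above:

using System;
using System.Linq;

public static class CnlNaming
{
    // Hypothetical helper: "have-first-name" -> "First-Name"
    public static string FieldLabel(string role)
    {
        const string prefix = "have-";
        string name = role.StartsWith(prefix) ? role.Substring(prefix.Length) : role;
        return string.Join("-", name.Split('-')
            .Select(part => char.ToUpper(part[0]) + part.Substring(1)));
    }
}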
Using this schema it is possible to construct all types of form fields. For example, the code

Every customer must have-customer-type a customer-type.
Something is a customer-type if-and-only-if-it is either Already-Client or New-Client.


This will result in a field of type select:


It is also possible to have optional checkbox fields using can, or additional requirements if a certain choice is made. For example, by adding the CNL code

Something is a already-client-customer if-and-only-if-it has-customer-type Already-Client.
Every already-client-customer must have-phone-number (some-integer-value).

Then, after choosing the Already-Client option in the form, we will be asked for the customer's phone number.
Clearly, with this method, if we later decide that a field, a field type or a field name has to be changed, we can do so immediately by editing the CNL file.
Another interesting consequence of using a CNL file to render the form is that after submitting the form we can easily obtain a CNL-compliant result. For example, filling out and submitting the form presented above yields CNL code like the following:

Customer-Form-0 is a customer-form.
Customer-1 is a customer.
Customer-1 is a already-client-customer.
Customer-1 has-first-name equal-to 'John'.
Customer-1 has-surname equal-to 'Dow'.
Customer-1 has-customer-type Already-Client.
Customer-1 has-phone-number equal-to '394829388'.
Customer-Form-0 concerns Customer-1.


Going yet one step further, we can add to the model some simple rules that we would like the elements of the form to satisfy:

Every package is-for a customer-type.
 
If a customer have-customer-type a thing and a package is-for the thing then the package is-package-for the customer.

And a simple instance of the package type:

Already-Client-Package is a package.
Already-Client-Package is-for Already-Client.

Then, using the Ontorion reasoner inside Fluent Editor, it is possible to ask questions like "Who or What is-package-for Customer-1?", and the answer will be: Already-Client-Package. This can also be done automatically by the C# application using the MicroOntorion API. At this point the full potential of this method is unveiled. In the next part we will show how, using this kind of approach, it is possible to make a complex business application powered by Cognitum's semantic technologies, with a simple and intuitive user interface.
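
As a sketch, the question above could be asked programmatically like this (it reuses the MicroOntorion calls shown in the microOntorion SDK post below, and assumes the ontology above has been loaded into a MicroOntorion instance named microOntorion):

// ask the reasoner who or what is-package-for Customer-1
string query = "Who-Or-What is-package-for Customer-1 ?";
List<string> answers = microOntorion.GetInstances(query, int.MaxValue);
// answers is expected to contain: Already-Client-Package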


If you want to learn more about Fluent Editor CNL-EN grammar, visit this link.

*) FluentEditor 2, an ontology editor, is a comprehensive tool for editing and manipulating complex ontologies that uses Controlled Natural Language. Fluent Editor provides an alternative to XML-based OWL editors that is more suitable for human users. Its main feature is the use of Controlled English as a knowledge modeling language. Supported by the Predictive Editor, it prevents the user from entering any sentence that is grammatically or morphologically incorrect and actively helps the user during sentence writing. Controlled English is a subset of Standard English with restricted grammar and vocabulary, in order to reduce the ambiguity and complexity inherent in full English.

Tuesday, 4 June 2013

Does your application understand you? Hello World with microOntorion SDK.

Nature likes symmetry. If you interact with someone, you usually expect interaction from the other side. The same applies to applications. A huge effort is put into developing programs that will not only process their input, but also understand it (or at least try to understand it). That is the point of ontologies and semantic technology.
Now, as a programmer, you can participate in the semantic world of applications. This post will show how to use an ontology in your application with the help of microOntorion.

microOntorion SDK

microOntorion is an endpoint to the Ontorion environment. It performs all reasoning locally, on your computer. It is provided as a .dll library.

Before you start working on your semantic application, you need to add a reference to the microOntorion library.

The following using directives may be helpful:
using Ontorion.MicroOntorion;
using Ontorion;

Initialization


Let's create, in the Main function, a MicroOntorion object that allows us to import an ontology and query against it:

MicroOntorion microOntorion = new MicroOntorion();

Now we can load the ontology. The source ontology should be prepared as CNL sentences. The easiest way to do this is to use Fluent Editor 2. Its auto-complete feature ensures that your ontology and queries are grammatically valid sentences. Fluent Editor 2 saves ontologies as *.encnl files, which you can stream directly into microOntorion.
It is also possible to construct the sentences as a string and load them in that form into the microOntorion library.
We will use the first method.
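For completeness, the string-based variant might look like this (a sketch only; it assumes Load accepts any readable Stream, which this post only demonstrates for a FileStream):

using System.IO;
using System.Text;

string cnl = "Server-1 is a server.";
using (MemoryStream cnlStream = new MemoryStream(Encoding.UTF8.GetBytes(cnl)))
{
    // hypothetical: stream CNL text built in code instead of reading a file
    microOntorion.Load(cnlStream);
}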
Let's construct a simple ontology (in Fluent Editor 2) and save it as a myOntology.encnl file.


Comment: 'Sample IT ontology'.

Server-1 is a server and hosts Application-1.
Server-2 is a server and hosts Application-2.

Server-1 has-ip-address equal-to '173.194.70.102'.
Server-1 has-ip-address equal-to '173.194.70.103'.
Server-1 has-ip-address equal-to '173.194.70.104'.

Server-2 has-ip-address equal-to '206.190.36.45'.

Application-1 is an application that serves Customer-1 and serves Customer-2.
Application-3 is an application that serves Customer-3.

Application-1 has-name equal-to 'Fluent Editor'.
Application-1 has-name equal-to 'Fluent Editor 2'.
Application-3 has-name equal-to 'Ontorion'.

Customer-1 is a customer and has-severity critical.
Customer-2 is a customer and has-severity medium.
Customer-3 is a customer and has-severity low.

X is-hosted-on Y if-and-only-if Y hosts X.
Every application must be-hosted-on server.

Part-2: 'Incidents'.
Incident-1 has-reported-date equal-to 2012-01-01 and was-reported-by Operator-1.
Incident-1 has-affected Server-1.

Incident-2 has-reported-date equal-to 2012-01-02 and was-reported-by Operator-1.
Incident-2 has-affected Application-2.


Now we can import our ontology into microOntorion:

try
{
    using (FileStream ontologyFileStream = new FileStream("../../../myOntology.encnl", FileMode.Open, FileAccess.Read))
    {
        // load ontology from file
        microOntorion.Load(ontologyFileStream);
    }
}
catch (Ontorion.ConsistencyException e)
{
    // when an exception has been thrown, check whether the knowledge base is inconsistent
    foreach (var expl in microOntorion.GetExplanations())
    {
        PrintResults(expl);
    }
    return;
}
catch (Exception e)
{
    return;
}

The Load function also performs some preprocessing, so your application processes the ontology only once.
The GetExplanations function provides information about the sources of errors that occurred during ontology preprocessing (e.g. the ontology is inconsistent).
The PrintResults function just prints the results:

private static void PrintResults(List<string> result)
{
    string res = "";
    foreach (var item in result)
    {
        res += item + " ";
    }
    Console.WriteLine(res);
}

Asking a query

At this point, we have created the MicroOntorion object and loaded the ontology. It is time to ask a question. Let's prepare one:

// build your query
string query = "Who-Or-What is-hosted-on server that has-ip-address equal-to '173.194.70.102' ?";

Get the results of this query:

// get superconcepts returned by the query
List<string> superconcepts = microOntorion.GetSuperconceptsOf(query);
Console.Write("Superconcepts: ");
PrintResults(superconcepts);

// get subconcepts returned by the query
List<string> subconcepts = microOntorion.GetSubconceptsOf(query);
Console.Write("Subconcepts: ");
PrintResults(subconcepts);

// get all instances returned by the query
List<string> instances = microOntorion.GetInstances(query, int.MaxValue);
Console.Write("Instances: ");
PrintResults(instances);

GetSubconceptsOf, GetSuperconceptsOf and GetInstances return, respectively, the subconcepts, superconcepts and instances that satisfy the query.

Attributes

We can also ask for the attributes of specific instances returned by the previous call, e.g. Application-1, which has a name attribute. Let's get this attribute:

// get names of the applications (as attributes)
foreach (var item in instances)
{
    Console.WriteLine("{0} has name(s): {1}", item, string.Join(", ", microOntorion.GetAttributeValues(item, "have-name").ToArray()));
}

OWLAPI has some problems with extracting attributes from complex sentences. It is recommended to attach attributes to instances in a separate sentence.

Requirements

It is also possible to ask for requirements. Let's get all requirements for the application concept.

// get all modalities for application concept.
var res = microOntorion.GetRequirements("application");
Console.WriteLine("Requirements for application concept:");
foreach (var item in res)
{
    Console.WriteLine("- {0}", item.Key);
    foreach (var req in item.Value)
    {
        Console.WriteLine("--- {0}", req);
    }
}

MicroOntorion.GetRequirements currently supports only simple requirements of the form:
Every <C> <modality> <R> <D>
where:
<C> is a concept, e.g. application,
<modality> is a modality such as must, should, etc.,
<R> is a role, e.g. be-hosted-on,
<D> is any ending of the sentence and can be quite complex.

The result of querying our ontology:

Superconcepts:
Subconcepts:
Instances: Application-1
Application-1 has name(s): Fluent Editor 2, Fluent Editor
Requirements for application concept:
- MUST
--- is-hosted-on a server
Press any key to continue . . .

Whole source code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Ontorion.MicroOntorion;
using System.IO;
using Ontorion;

namespace Sample
{
    class Program
    {
        static void Main(string[] args)
        {
            MicroOntorion microOntorion = new MicroOntorion();
            try
            {
                using (FileStream ontologyFileStream = new FileStream("../../../myOntology.encnl", FileMode.Open, FileAccess.Read))
                {
                    // load ontology from file
                    microOntorion.Load(ontologyFileStream);

                    // build your query
                    string query = "Who-Or-What is-hosted-on server that has-ip-address equal-to '173.194.70.102' ?";

                    // get superconcepts returned by the query
                    List<string> superconcepts = microOntorion.GetSuperconceptsOf(query);
                    Console.Write("Superconcepts: ");
                    PrintResults(superconcepts);

                    // get subconcepts returned by the query
                    List<string> subconcepts = microOntorion.GetSubconceptsOf(query);
                    Console.Write("Subconcepts: ");
                    PrintResults(subconcepts);

                    // get all instances returned by the query
                    List<string> instances = microOntorion.GetInstances(query, int.MaxValue);
                    Console.Write("Instances: ");
                    PrintResults(instances);


                    // get names of the applications (as attributes)
                    foreach (var item in instances)
                    {
                        Console.WriteLine("{0} has name(s): {1}", item, string.Join(", ", microOntorion.GetAttributeValues(item, "have-name").ToArray()));
                    }

                    // get all modalities for application concept.
                    var res = microOntorion.GetRequirements("application");
                    Console.WriteLine("Requirements for application concept:");
                    foreach (var item in res)
                    {
                        Console.WriteLine("- {0}", item.Key);
                        foreach (var req in item.Value)
                        {
                            Console.WriteLine("--- {0}", req);
                        }
                    }

                }
            }
            catch (Ontorion.ConsistencyException e)
            {
                // when an exception has been thrown, check whether the knowledge base is inconsistent
                foreach (var expl in microOntorion.GetExplanations())
                {
                    PrintResults(expl);
                }
                return;
            }
            catch (Exception e)
            {
                return;
            }
        }

        private static void PrintResults(List<string> result)
        {
            string res = "";
            foreach (var item in result)
            {
                res += item + " ";
            }
            Console.WriteLine(res);
        }
    }
}

---

You can download microOntorion SDK here.



*) FluentEditor 2, an ontology editor, is a comprehensive tool for editing and manipulating complex ontologies that uses Controlled Natural Language. Fluent Editor provides an alternative to XML-based OWL editors that is more suitable for human users. Its main feature is the use of Controlled English as a knowledge modeling language. Supported by the Predictive Editor, it prevents the user from entering any sentence that is grammatically or morphologically incorrect and actively helps the user during sentence writing. Controlled English is a subset of Standard English with restricted grammar and vocabulary, in order to reduce the ambiguity and complexity inherent in full English.

Thursday, 18 April 2013

Automated testing in the cloud

Overview

A modern enterprise organization usually maintains at least one web application or intranet system. A critical issue for the organization is to ensure that critical business functions will be available to customers, suppliers, regulators, and other entities that must have access to those functions.
Maintaining business continuity requires executing system tests, in particular functional tests, performance tests and stress tests, and continuously monitoring services. The traditional approach to testing raises a number of problems and costs: it requires an ongoing commitment to maintaining a validation team, infrastructure, tools and licenses in order to plan and execute testing and reporting, while every organization faces a limited budget and tight deadlines for delivering tested solutions. If you also count the cost of a single test, the number of tests needed for a full test cycle, the need for regression testing, poor reusability and the lack of testing in a distributed environment with multiple locations, it becomes clear that some tests cannot be carried out using traditional methods: the human resources and infrastructure costs are simply too high.


Solution

A testing platform is the solution to all these problems. Imagine that you have a team of hundreds or even thousands of validation engineers, and that this team executes, day and night over several weeks, a large number of test scenarios using various operating systems and browsers from distributed locations. Cognitum provides a flexible automated cloud testing platform based on Microsoft Azure. The platform shifts application testing into virtual infrastructure and simulates real-world user traffic from different locations, operating systems, browsers and test cases. The scalability of the infrastructure makes it possible to employ hundreds or even thousands of virtual computers on demand, and its flexibility controls the number and configuration of those virtual machines. The distributed testing environment allows simulating users, maintaining business continuity and executing almost all types of testing as a cost-effective solution. The solution scales with the needs of the company, up to the maximum capacity of the infrastructure, the database or the Azure cost plan.

Customer’s issue

Consider a blue-chip company from the energy sector with a number of internal systems, websites and a web portal for its customers. The company must deliver continuous operation of its internal systems for the proper functioning of the accounting department, electric grid monitoring, the HR system and the client department. In addition, continuous monitoring of the website is required, because it informs about the company, prices and urgent messages about failures and network maintenance. On another site, the company has a system for individual and business customers, which allows them to log on from anywhere and keep control over their bills and payments.
For testing internal systems, we use a virtual network (based on Microsoft Azure) that allows secure access to the local network. Connecting via a VPN gateway protects corporate data, which is very important for any organization. Thus, our test platform has access to the live system and data in a production environment, which makes the solution more reliable and effective.
For websites and web applications, the testing platform can simulate massive user traffic, check continuous system availability, and perform regression testing after each update.
Moreover, the company requires a duplicate of the production environment for testing purposes. Applications and systems are moved and launched as a testing environment in the cloud. For security reasons, sensitive data is anonymized and reproduced from a statistical model of the production data. In this environment, testing can be performed using traditional methods, but with a testing platform in the cloud both methods can be combined. The test platform running in the cloud carries out previously designed test scenarios; their execution is managed by a special tool called Test Manager. In this way, the test scenarios can be run against the duplicated applications in a virtual environment, which completely frees the company from maintaining a physical test infrastructure.

Benefits

The implemented solution has brought a new quality to testing web applications and enterprise infrastructure. With the test platform in the cloud, the organization has reduced the size of its validation team and the total cost of system maintenance.


Cognitum cooperates with Microsoft under the prestigious Azure Circle program, to which technology partners with experience in Windows Azure are invited. It provides IT solutions in the areas of Cloud and Big Data for customers both in Poland and abroad.
 

Wednesday, 23 January 2013

The Program For A Machine

A computer program is a specification of machine behaviour. To specify it, one can describe how the machine should behave to realize the specified task. If it is specified how the machine should act, then we can say that the program is written in an imperative rather than a declarative manner. The declarative manner specifies the goal in a formal way, leaving the realization to a general and powerful automatic process that will find the optimal way to achieve it.
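
To make the distinction concrete, here is a small illustration in C# (our addition, not part of the original argument): the imperative version spells out the steps and mutates state, while the declarative LINQ version states the goal and leaves the realization to the library:

using System;
using System.Linq;

class ImperativeVsDeclarative
{
    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4 };

        // Imperative: describe HOW, step by step, changing the machine state
        int total = 0;
        foreach (int n in numbers)
            total += n;

        // Declarative: describe WHAT is wanted, leaving the 'how' to the library
        int declarativeTotal = numbers.Sum();

        Console.WriteLine("{0} {1}", total, declarativeTotal);
    }
}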

In the 1930s, Church and Turing proposed different ideas for a formal system. The lambda calculus and the Turing machine, which were ultimately proven to be logically equivalent, are nowadays recognized as precursors of the two main families of programming languages: functional and imperative. The von Neumann architecture, the model of the modern computer, implements a universal Turing machine. Imperative programming languages were the first in this evolution and are now the oldest well-known ones. Imperative programming describes the computation in terms of a sequence of statements that can change the machine state - in other words, a program written in an imperative language is a specification of a sequence of commands. What is more, the possibility of changing the state of the machine is the key feature here. The oldest imperative programming language still used nowadays is FORTRAN, created in 1954. The rest of the history of imperative languages is as follows: BASIC (1964), Pascal (1970), C (1972), Ada (1978), Smalltalk (1980), C++ (1985), PHP (1994), Java (1995), C# (2002).

A good programming language has to keep pace with the progress of technology and, at the same time, it must respond to market needs.
Object-oriented languages, the novel offspring of imperative languages, lead the programmer to use objects as the main conceptual constructs for building a virtual world that can be easily understood by a human brain. A well-written program in an object-oriented programming language models algorithms and data, uses concepts and the relationships between them, and raises them step by step to higher levels of abstraction, while ensuring that these concepts have the properties of physical objects. This approach is similar to the process of ontology engineering based on virtual objects. Simula (1967) is generally accepted as the first language to support the primary features of an object-oriented language. Another example is C++, a general-purpose language developed as an offspring of C that is equipped (with some limitations) with object-oriented abilities. Platform independence was the driving force behind Java and C#, the modern object-oriented languages with the highest impact in the field nowadays.


Spectrum of Computer Languages

Object technology, invented in 1965 in a lab at the University of Oslo (together with the Simula language), is built around three basic concepts: instance, class and superclass, and two basic relations: instance-of and inherits-from. In the 1980s the exact meaning of these relations was widely discussed, including the controversies between single and multiple inheritance. Modern object-oriented languages go beyond those basic concepts by providing more or less general meta-class organization schemes (a class being itself an instance of a metaclass). As a consequence, when we nowadays talk about an object (the instance of a class), the context is most important. In a general context, we usually mean an entity corresponding to the common scheme, but if we need to be more specific, we refer to a C# object, a Java object, a C++ object, an Eiffel object, a CLOS (Common Lisp Object System) object, etc., with respect to their additional properties.

Lisp was the first programming language to use the approach taken from Church's lambda calculus. Functional languages have nowadays been adopted into the common mainstream, as they were also precursors of modern programming languages. Going back in the history of computer languages, the debates of the late 1960s and early 1970s (about declarative versus procedural representations of knowledge in artificial intelligence) resulted in the introduction of a novel family of programming languages: the logical ones. Logic programming is regarded as separate from functional languages insofar as functional languages are still focused on functions - recursive concepts that realize the task. One of the logic-based languages widely used in computer science is SQL, which is based on Codd's well-behaved relational algebra and has a counterpart in logic. PROLOG implements Horn logic. The mixture of PROLOG and SQL resulted in the DATALOG language - a subset of PROLOG oriented toward databases.

Let's now take a look at the esoteric ones :) -> http://en.wikipedia.org/wiki/Esoteric_programming_language