1. Introduction
The objective world consists of five factors: MATTER, ENERGY, INFORMATION, SPACE and TIME. Among these factors there are many inner and outer laws of causation, called objective laws, which can also be called objective logic. Corresponding to the objective laws there is a mirror image inside the machine of artificial intelligence, which can be called subjective laws, or subjective logic. The machine of artificial intelligence holds the objective laws only through the subjective laws, or in other words, holds the objective logic only through the subjective logic. Subjective logic is built by repeating, an unlimited number of times, the three-step cycle of Sensation, Abstraction and Thinking. Sensation is the machine's ability to take in objective information through sensors. Abstraction is the ability to classify the sensed information into conceptions and to name them. Thinking is the machine's ability to research the conception-classified information and so obtain the subjective logic.
The dialectical logic K-model supplies a computational idea for the machine so that the machine is able to think by the dialectical-logic method [1] [2]; an important information-processing method is therefore dialectical logic. In this paper the author builds a mathematical model for dialectical logic; combined with the three main identification technologies (identification of images, identification of sound, and identification of written language), database technology and machine self-programming technology, the model will bring great progress to artificial intelligence.
2. Axiom System and Theorems
2.1. Conception and the Conception-Dimension
2.1.1. The Conception
A conception is the name given to an object-thing according to its main property, its secondary properties being neglected. Conception is the basis of artificial intelligence.
2.1.2. The Conception-Dimension
The conception-dimension is the dimension-number of a conception. For example, HORSE is a one-dimension conception; WHITE HORSE is a two-dimension conception; RUNNING WHITE HORSE is a three-dimension conception; ONE RUNNING WHITE HORSE is a four-dimension conception; etc.
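For illustration, a conception may be represented in the machine as an ordered collection of its retained main properties, so that the conception-dimension is simply the number of properties. The following minimal sketch assumes this representation; the class name and fields are illustrative only.

```python
# Illustrative sketch only: a conception represented as a tuple of named
# properties; its conception-dimension is the number of properties it carries.

from dataclasses import dataclass
from typing import Tuple


@dataclass(frozen=True)
class Conception:
    properties: Tuple[str, ...]  # main properties kept, secondary ones neglected

    @property
    def dimension(self) -> int:
        # The conception-dimension is the dimension-number of the conception.
        return len(self.properties)


horse = Conception(("HORSE",))                                   # one-dimension
white_horse = Conception(("WHITE", "HORSE"))                     # two-dimension
running_white_horse = Conception(("RUNNING", "WHITE", "HORSE"))  # three-dimension

print(horse.dimension, white_horse.dimension, running_white_horse.dimension)
# -> 1 2 3
```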
2.1.3. Associate-Data-Base
The three main identification technologies set up three databases (there are also other databases, for example a physical database and a chemical database, etc.). These three databases are associated with one another through the conception-dimension.
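As a rough illustration of the association, the sketch below keys an image database, a sound database and a written-language database by a shared conception; the dictionary layout and field names are assumed for illustration and are not prescribed by the model.

```python
# Illustrative sketch: three identification databases (image, sound, written
# language) associated through a common conception key.

image_db = {"HORSE": ["horse_photo_001.png"]}
sound_db = {"HORSE": ["neigh_recording_07.wav"]}
text_db = {"HORSE": ["horse: a large four-legged mammal ..."]}


def associate(conception: str) -> dict:
    """Collect every record that the three databases hold for one conception."""
    return {
        "conception": conception,
        "images": image_db.get(conception, []),
        "sounds": sound_db.get(conception, []),
        "texts": text_db.get(conception, []),
    }


print(associate("HORSE"))
```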
2.2. Several Laws
2.2.1. Definitions
Denote the objective domain by B and the subjective domain by A. Define the research-arithmetic (an arithmetic that obtains subjective logic from objective logic) as a transformation from B to A, and the inverse research-arithmetic (an arithmetic that carries subjective logic back into objective logic in order to check whether it is true or false) as a transformation from A to B. The j-th cycle research-arithmetic is a logic-transformation between domain A and domain B; the logic-transformation is sometimes denoted simply by F. A mirror-image symbol is used to denote that the objective logic and the subjective logic are mirror images of each other.
Denote the subjective logic-variable by x, a function of the time-variable t; its right superscript denotes that the logic-variable contains n couples of contradicting sub-variables, and n is called the rank-number of the logic-variable; its right subscript i denotes the i-th causation law. The true-valued-function of the logic-variable x is denoted by T[x]. The mirror-image of x in the objective domain is the corresponding objective logic-variable.
2.2.2. Objective-Researchable Law
To every objective logic-variable there must always exist a logic-transformation F that carries it into its mirror-image subjective logic-variable.
2.2.3. Research Error Alternating-Convergence Law
For every correct logic-transformation F, as the research cycles proceed the error-function between the subjective logic and its objective mirror image alternates in sign and converges; i.e. for every finite cycle number j, the error-function takes one sign on one cycle and the opposite sign on the next, and its magnitude tends to zero as j grows. In fact, an incorrect logic-transformation can bring divergence into the researching. The faster the convergence of the error-function, the brighter the intelligence of the machine.
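The law may be pictured with a toy error sequence that alternates in sign while shrinking toward zero; the particular sequence used below is assumed only to show what alternating convergence means and is not a formula of the model.

```python
# Illustrative sketch: an error sequence that alternates in sign and converges
# to zero as the research cycle number j grows, as the law requires.

def error(j: int) -> float:
    # Assumed toy error model: sign alternates each cycle, magnitude shrinks.
    return (-1.0) ** j / j


errors = [error(j) for j in range(1, 9)]
print(errors)

# Check the two qualitative properties stated by the law.
alternating = all(errors[k] * errors[k + 1] < 0 for k in range(len(errors) - 1))
shrinking = all(abs(errors[k + 1]) < abs(errors[k]) for k in range(len(errors) - 1))
print("alternating:", alternating, "shrinking toward zero:", shrinking)
```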
2.2.4. Logic-Variable Energy Conservation Law
For a logic-variable, its inner producing-energy is equal to the work consumed by its contradiction-function (see the contradiction-function in 2.10.3 and the work and energy law in 2.10.4 below).
2.2.5. Mozi-Principle (Minimax Principle) [3]
A logic-variable, in its changing, must satisfy, either exactly or asymptotically, the requirement that its cost-function is minimum and its gain-function is maximum. Denote the pure gain-function as the gain-function minus the cost-function; the principle then requires the pure gain-function to be maximal, or at least asymptotically maximal.
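A minimal sketch of the Mozi-principle reads it as choosing, among candidate changes of the logic-variable, the change whose pure gain (gain-function minus cost-function) is largest; the candidate list and the numerical gain and cost values below are assumed for illustration.

```python
# Illustrative sketch: pick the change whose pure gain-function
# (gain-function minus cost-function) is maximal, per the Mozi-principle.

candidates = ["change_a", "change_b", "change_c"]

# Assumed toy gain and cost values for each candidate change.
gain = {"change_a": 5.0, "change_b": 9.0, "change_c": 7.0}
cost = {"change_a": 2.0, "change_b": 6.0, "change_c": 1.0}


def pure_gain(change: str) -> float:
    return gain[change] - cost[change]


best = max(candidates, key=pure_gain)
print(best, pure_gain(best))  # -> change_c 6.0
```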
2.2.6. Memory-Inertia Law
1) Last-time memory law
For a time-sequence t1 < t2 < ... < tm, denote by w(t) the memory effect-weight-function of a contradicting sub-variable from time t; then the effect-weight of one time is greater than that of another if and only if the one time is later. Memory prefers the contradicting sub-variable of the last (most recent) time.
2) Importance memory law
For an importance-sequence, the memory effect-weight grows with importance, so memory prefers a contradicting sub-variable if and only if its importance is bigger.
3) Bigger probability memory law
For property-events with their corresponding probabilities, the memory effect-weight grows with probability, so memory prefers the property-event whose probability is bigger.
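The three memory laws may be read together as a memory effect-weight that grows with recency, with importance and with probability; the multiplicative combination used in the sketch below is an assumed concrete choice, not one prescribed by the model.

```python
# Illustrative sketch: a memory effect-weight that prefers more recent,
# more important and more probable items, per the three memory laws.

import math
from dataclasses import dataclass


@dataclass
class MemoryItem:
    name: str
    time: float          # later time -> preferred (last-time memory law)
    importance: float    # bigger importance -> preferred (importance memory law)
    probability: float   # bigger probability -> preferred (bigger probability memory law)


def weight(item: MemoryItem, now: float) -> float:
    # Assumed combination: recency decays exponentially with elapsed time,
    # while importance and probability enter multiplicatively.
    return math.exp(-(now - item.time)) * item.importance * item.probability


items = [
    MemoryItem("old_unimportant", time=0.0, importance=0.2, probability=0.5),
    MemoryItem("recent_important", time=9.0, importance=0.9, probability=0.8),
]
print(max(items, key=lambda it: weight(it, now=10.0)).name)  # -> recent_important
```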
2.3. Machine Self-Programmable and Self-Correctable Law
In the researching process the machine must have the ability to program itself and to correct itself without any operation by human beings.
2.4. Forbidden Law
There are two kinds of logic: one is dialectical logic, corresponding to the intelligence quotient, and the other is imagine-logic, corresponding to the emotional quotient.
The imagine-logic must be forbidden in the machine, because imagine-logic would give the machine an emotional quotient, so that an autonomous mind would belong to the machine. An artificial intelligence with an autonomous mind would not like to be "a tool" for human beings, so "a new creation" would be created; of course this is harmful to human beings.
2.5. Logic-Variable Infinite-Separable-Characteristic Law
2.6. Logical Inductive and Deductive Method Theorem
For the cycle research-arithmetic, the following scheme holds:
1) (inductive) the statement is verified to be true for finitely many research cycles;
2) (hypothesis) suppose the statement is true for the k-th cycle;
3) (deductive) then the statement is true for all cycles.
Proof: because the scheme proves the infinite through the finite, an error-function always exists by the research error alternating-convergence law (2.2.3); if the hypothesis in 2) is false, then falsehood is produced in the deduction of 3). Thus the truth of every step of the scheme must be carefully checked.
Proof is over.
2.7. Definitions of Logic Algorithm
For logic-variables x, y, z, denote their corresponding true-valued-functions by T[x], T[y], T[z].
2.7.1. Logic ∨ (OR Arithmetic)
Definition: the logic OR arithmetic ∨ is defined through the true-valued-functions of its operands and satisfies
commutative law: x ∨ y = y ∨ x;
associative law: (x ∨ y) ∨ z = x ∨ (y ∨ z).
2.7.2. Logic ∧ (AND Arithmetic)
Definition: the logic AND arithmetic ∧ is defined through the true-valued-functions of its operands and satisfies
commutative law: x ∧ y = y ∧ x;
associative law: (x ∧ y) ∧ z = x ∧ (y ∧ z).
2.7.3. Logic Hybrid Arithmetic (OR & AND)
The logic hybrid arithmetic satisfies the distributive law between the OR and AND arithmetics.
2.7.4. Logic N (NOT Arithmetic)
The NOT arithmetic N turns a logic-variable into its contradicting counterpart; the 1 carries a positive sign in front of it if the operand is positive, and a negative sign in front of it if the operand is negative. Denote the result by Nx. Arithmetic N satisfies the idempotent law: applying N twice returns the original logic-variable, in the positive case and in the negative case alike.
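Because the defining formulas act on true-valued-functions, the sketch below adopts one common concrete reading that the model does not itself fix: OR as the pointwise maximum, AND as the pointwise minimum and NOT as one minus the true value, applied to time-dependent true-valued-functions.

```python
# Illustrative sketch, under the ASSUMED reading (not stated in the paper) that
# the logic arithmetics act on time-dependent true-valued-functions as
# OR = pointwise max, AND = pointwise min, NOT = 1 - T.

from typing import Callable

TrueValued = Callable[[float], float]  # t -> truth value


def OR(a: TrueValued, b: TrueValued) -> TrueValued:
    return lambda t: max(a(t), b(t))


def AND(a: TrueValued, b: TrueValued) -> TrueValued:
    return lambda t: min(a(t), b(t))


def NOT(a: TrueValued) -> TrueValued:
    return lambda t: 1.0 - a(t)


x: TrueValued = lambda t: 0.8 if t < 5 else 0.3   # assumed toy true-valued-function
y: TrueValued = lambda t: 0.4                     # assumed constant one

for t in (1.0, 7.0):
    print(t, OR(x, y)(t), AND(x, y)(t), NOT(x)(t))

# The commutative and associative laws of 2.7.1-2.7.2 hold pointwise for max and min.
```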
2.8. De Morgan’s Theorem
(1) N(x ∨ y) = Nx ∧ Ny;
(2) N(x ∧ y) = Nx ∨ Ny.
Proof: both formulations follow from the definitions of the ∨, ∧ and N arithmetics on the true-valued-functions.
Formulation (1) proof is over.
Formulation (2) proof is over.
Remark 1: When the time t is degenerated into a constant and the contradictions in the formulations are destroyed, then the formulations above degenerate into the mathematical model of formal logic, i.e. Boolean algebra.
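Under the same assumed maximum/minimum/complement reading as in the previous sketch, the two formulations of De Morgan's theorem can be spot-checked numerically at sample times; this is only a check of the assumed reading, not the proof given above.

```python
# Illustrative spot check of De Morgan's theorem under the assumed
# max / min / (1 - T) interpretation of OR, AND and NOT.

def OR(a, b):  return lambda t: max(a(t), b(t))
def AND(a, b): return lambda t: min(a(t), b(t))
def NOT(a):    return lambda t: 1.0 - a(t)

x = lambda t: 0.2 + 0.1 * t      # assumed toy true-valued-functions
y = lambda t: 0.9 - 0.05 * t

for t in (0.0, 2.0, 4.0):
    lhs1, rhs1 = NOT(OR(x, y))(t), AND(NOT(x), NOT(y))(t)   # formulation (1)
    lhs2, rhs2 = NOT(AND(x, y))(t), OR(NOT(x), NOT(y))(t)   # formulation (2)
    assert abs(lhs1 - rhs1) < 1e-12 and abs(lhs2 - rhs2) < 1e-12
print("De Morgan holds at the sampled times under the assumed interpretation.")
```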
2.9. Logic Algorithm True-Valued-Function Composition Theorem
For the logic arithmetics acting on true-valued-functions there is a composition theorem: if the defining equation of a composed logic arithmetic has a root-set, then the true-valued-function of the composition is determined on that root-set in the same way as above; possible augmented roots and complex roots are removed; and if several roots remain, the root-set is reduced by selecting the maximum or the minimum of the pure function.
Proof: obvious, based on the definitions of the logic arithmetics.
Proof is over.
2.10. Kirchhoff's Power-Function Law and Kirchhoff's Flow-Function Law [4]
Define a connected directed simple graph GK with N nodes and E edges. In the graph GK every node is given a power-function; for two nodes joined by an edge of GK, there exists on that edge a power-function equal to the power-function of the lower-reaches node minus the power-function of the upper-reaches node; taken in the opposite direction, a negative sign is placed in front of the edge power-function. In the graph GK, if an edge exists, then it is also given a flow-function, whose direction is from the upper-reaches node to the lower-reaches node. The graph GK satisfies 2.10.1, 2.10.2, 2.10.3 and 2.10.4 as below.
2.10.1. Kirchhoff's Power-Function Law
In the graph GK every cycle satisfies the condition that the algebraic sum of the edge power-functions around the cycle is zero.
2.10.2. Kirchhoff's Flow-Function Law
In the graph GK every node satisfies the condition that the algebraic sum of the flow-functions at the node is zero; as above, an input flow is counted as positive and an output flow as negative.
2.10.3. Contradiction-Function
A contradiction-function is defined on every edge of the graph GK.
2.10.4. Work and Energy Law
In the graph GK, for every edge, the quantity of work done in time t by the contradiction-function on that edge is defined.
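The two laws can be illustrated on a tiny directed graph in which the edge power-functions sum to zero around the single cycle and the flow-functions balance at every node; the three-node example and its numbers are assumed for illustration.

```python
# Illustrative sketch: check the power-function law (cycle sum = 0) and the
# flow-function law (node balance = 0) on a tiny assumed three-node graph.

node_power = {"a": 3.0, "b": 1.0, "c": 0.0}

# Directed edges (from upper-reaches to lower-reaches) carrying one assumed
# flow value that circulates around the single cycle a -> b -> c -> a.
flow = {("a", "b"): 2.0, ("b", "c"): 2.0, ("c", "a"): 2.0}


def edge_power(u: str, v: str) -> float:
    # Edge power-function taken as the drop of node power along the edge.
    return node_power[u] - node_power[v]


# Power-function law: the algebraic sum of edge power-functions around a cycle is 0.
cycle = [("a", "b"), ("b", "c"), ("c", "a")]
assert abs(sum(edge_power(u, v) for u, v in cycle)) < 1e-12

# Flow-function law: at every node, input flow (+) and output flow (-) cancel.
for n in node_power:
    balance = sum(f for (u, v), f in flow.items() if v == n) \
            - sum(f for (u, v), f in flow.items() if u == n)
    assert abs(balance) < 1e-12

print("Both Kirchhoff-style laws hold on the toy graph.")
```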
2.11. Structure in Graph GK of the Logic-Variable (Figure 1)
1) The graph GK has n couples of nodes plus two further nodes, 2n + 2 nodes in total;
2) The graph GK is a directed simple graph with no loop and no multiple edge;
3) In the graph GK the n + 1 positive nodes construct one perfect subgraph, and the other n + 1 negative nodes construct another perfect subgraph;
4) Of the two further nodes, the positive one connects only to the negative one; every positive node connects only to the negative node whose right subscript is equal to its own;
5) The power-function of the positive further node is a constant +1, the power-function of the negative further node is a constant −1, the power-function of the edge joining them is a constant +2, and the flow-function of that edge is a constant I;
6) In the graph GK all other nodes and edges are given node power-functions, edge power-functions, flow-functions and contradiction-functions, and these satisfy the Kirchhoff laws as above.
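The structural conditions 1)-6) can be turned into a small constructive sketch: n couples of positive and negative nodes plus the two constant nodes, a perfect subgraph on each side, and couplings between equally subscripted nodes; the node names and the value of n below are assumptions made for illustration.

```python
# Illustrative sketch of the structure 1)-6) of graph GK for a logic-variable:
# n couples of nodes plus two extra constant nodes, 2n + 2 nodes in total.

from itertools import combinations

n = 3
plus = [f"v+{k}" for k in range(n + 1)]    # positive side, including the constant node v+0
minus = [f"v-{k}" for k in range(n + 1)]   # negative side, including the constant node v-0

edges = set()
edges |= set(combinations(plus, 2))        # perfect subgraph on the n + 1 positive nodes
edges |= set(combinations(minus, 2))       # perfect subgraph on the n + 1 negative nodes
edges |= {(f"v+{k}", f"v-{k}") for k in range(n + 1)}   # equal-subscript couplings

node_power = {"v+0": +1.0, "v-0": -1.0}    # constant powers of the two further nodes
edge_power = {("v+0", "v-0"): +2.0}        # constant power of the edge joining them

print(len(plus) + len(minus), "nodes,", len(edges), "edges")  # -> 8 nodes, 16 edges
```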
2.12. Heredity and Variation Theorem
1) For a contradiction-function, if its first-order derivative exists and is not zero, then the contradiction-function is called variation; otherwise it is called heredity.
2) Every contradiction-function can always be seen as the algebraic sum of a constant C and a time-varying function; if the derivative of the time-varying function is not zero, then in this decomposition the time-varying function is variation and the constant C is heredity.
3) Heredity and variation theorem
Heredity and variation is a basic law in objective logic and of course also in subjective logic.
Proof: combining 1) and 2) above, the theorem can be proved obviously.
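A contradiction-function sampled over time can be split, as in 2) above, into a constant heredity part C and a remaining time-varying variation part; taking C to be the time average, as in the sketch below, is only one convenient assumed choice.

```python
# Illustrative sketch: split a sampled contradiction-function into a constant
# heredity part C and a time-varying variation part.

samples = [4.0, 4.5, 5.2, 4.8, 5.5, 6.0]       # assumed samples of a contradiction-function
C = sum(samples) / len(samples)                 # heredity: here chosen as the time average
variation = [value - C for value in samples]    # variation: what is left over

# The leftover part is a genuine variation when it actually changes over time,
# i.e. its (discrete) derivative is not identically zero.
changes = any(abs(variation[k + 1] - variation[k]) > 1e-12
              for k in range(len(variation) - 1))
print("C =", round(C, 3), "| variation present:", changes)
```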
2.13. The Critical-Point Theorem
For a contradiction-function whose m-th-order derivative exists, the points at which that derivative vanishes, together with some specially selected points, are called critical-points.
Property-Function Critical-Point Theorem
The existence of critical-points makes some new property-function be born or some old property-function die.
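Numerically, candidate critical-points of a sampled contradiction-function can be located where its discrete first derivative changes sign; the sampled values below are assumed for illustration, and higher-order derivatives could be treated in the same way.

```python
# Illustrative sketch: locate candidate critical-points of a contradiction-function
# as the sample times where its discrete first derivative changes sign.

ts = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
r = [0.0, 1.0, 1.8, 2.0, 1.7, 1.0, 0.2]   # assumed sampled contradiction-function

derivative = [(r[k + 1] - r[k]) / (ts[k + 1] - ts[k]) for k in range(len(r) - 1)]

critical = [ts[k + 1]
            for k in range(len(derivative) - 1)
            if derivative[k] * derivative[k + 1] <= 0.0]
print("candidate critical-points near t =", critical)   # around the maximum at t = 3
```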
2.14. Isomorphic-Equality ↔
If the rank-numbers of two logic-variables x and y are equal, then x and y are called isomorphic-equal, which is denoted by x ↔ y.
Isomorphic-equality ↔ satisfies the following:
reflexive law: x ↔ x;
symmetrical law: if x ↔ y, then y ↔ x;
transitive law: if x ↔ y and y ↔ z, then x ↔ z.
2.14.1. Similarity between Logic-Variables x and y, and the Thinking Analogy Method
1) Definition: two logic-variables x and y are called similar if
a. x ↔ y;
b. in the graphs GK of x and of y, the contradiction-functions of corresponding edges are proportional, i.e. the contradiction-function of every edge of one graph equals C times the contradiction-function of the corresponding edge of the other graph, in which C is a constant.
2) If x and y are similar, this is denoted x ∼ y. The similarity relation satisfies the following:
reflexive law: x ∼ x;
symmetrical law: if x ∼ y, then y ∼ x;
transitive law: if x ∼ y and y ∼ z, then x ∼ z.
3) Thinking analogy theorem
Two logic-variables x and y are analogy-able in thinking if and only if x ∼ y.
Proof: based on the definition of similarity, if x ∼ y then x ↔ y and the contradiction-functions of their corresponding edges differ only by the constant factor C; thus x and y are analogy-able in thinking.
Proof is over.
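The similarity test of 2.14.1 reduces to two checks, equal rank-number and proportional contradiction-functions on corresponding edges; the edge labels and numbers in the sketch below are assumed for illustration.

```python
# Illustrative sketch: two logic-variables are similar when their rank-numbers are
# equal and the contradiction-functions of corresponding edges are proportional.

def similar(rank_x: int, contra_x: dict, rank_y: int, contra_y: dict,
            tol: float = 1e-9) -> bool:
    if rank_x != rank_y or contra_x.keys() != contra_y.keys():
        return False                                  # not isomorphic-equal
    edges = list(contra_x)
    C = contra_x[edges[0]] / contra_y[edges[0]]       # candidate proportionality constant
    return all(abs(contra_x[e] - C * contra_y[e]) <= tol for e in edges)


contra_x = {("v+1", "v-1"): 2.0, ("v+1", "v+2"): 4.0}
contra_y = {("v+1", "v-1"): 1.0, ("v+1", "v+2"): 2.0}   # proportional with C = 2

print(similar(2, contra_x, 2, contra_y))   # -> True: analogy-able in thinking
```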
3. Conclusion
As shown above, the author has established an axiom system for the dialectical logic K-model, depending on several laws, some definitions, the graph GK and the Mozi-principle, and has proved some theorems. The advanced properties and theorems of the dialectical logic K-model will be explained in subsequent papers by the author.