Posts by Tomas Brada

1) Message boards : News : Autonomous Subproject (Message 729)
Posted 29 Jan 2019 by Tomas Brada
Post:
Miss Natalia
This project still has at least 117109 tasks ready to send, and new ones are being generated. Stopping it right now would be a waste and would kill the experiment the tasks belong to. This forum would be deleted too. It is possible that ice00 is not retired and has to work or study, looking after the servers in his spare time; I don't know.
2) Message boards : News : Autonomous Subproject (Message 723)
Posted 28 Jan 2019 by Tomas Brada
Post:
Is it ready to run in the BOINC project?
As I understand it, recompilation is required.

Yes, it is ready to run in BOINC. I compiled the exe on Windows for 64-bit Windows.
It is possible to compile it for win32. Compilation on Windows is complicated, because many things do not work straight away. It works better on Linux. Either I or ice00 should upload Linux binaries too.
3) Message boards : News : Autonomous Subproject (Message 721)
Posted 28 Jan 2019 by Tomas Brada
Post:
With the BOINC app compiled for Windows, it will be possible to run it in the BOINC project. ice00 should have a look!
4) Message boards : News : Autonomous Subproject (Message 720)
Posted 28 Jan 2019 by Tomas Brada
Post:
Publishing file odlk-progs-padlsboinc-win64.tar.xz http://www.tbrada.eu/up/8ad0ab24f7d58d368ec0325f06ec6799409675f3.tar.xz
Open with WinRAR, WinZip, or 7-Zip.

Programs from the odlk-progs repository compiled for Windows 64-bit. Tomáš Brada. All rights reserved; do not distribute or use outside of the Autonomous Subproject of Natalia Makarova and the associated BOINC project. This software is provided AS IS with NO WARRANTY.

Source code is freely available at https://github.com/gridcoin-community/odlk-progs . You can compile and edit the programs yourself for whatever system, and distribute the products in accordance with the licence included in the GitHub repository.

Uses libraries by Belyshev.

psevdoass.exe

BOINC application for the PADLS experiment. The generators used are by Natalia Makarova. Not multithreaded.

PADLS experiment BOINC app. Intended to be run from BOINC: there are no help messages and only minimal command-line options. Input is given on the command line: 8 numbers (0-9), taken as a Row for the experiment. Output is the file output.txt, but the physical name of that file is resolved by BOINC. Progress reporting is supported.

Checkpointing and resume are not currently supported.

* Written by: Tomáš Brada
* Libraries: Belyshev
* Leader: Natalia Makarova
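
For illustration, a minimal sketch of the input handling described above. The parse_row helper is hypothetical (the real app's internals may differ); in the actual app, BOINC's boinc_resolve_filename() would then map output.txt to its physical name.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: turn the 8 command-line arguments into a row of
// digits 0-9, as the PADLS app description suggests. Returns an empty
// vector on malformed input. In the real app, boinc_resolve_filename()
// would then resolve "output.txt" to its physical name (not shown).
std::vector<int> parse_row(const std::vector<std::string>& args) {
    if (args.size() != 8) return {};     // the Row is exactly 8 numbers
    std::vector<int> row;
    for (const std::string& a : args) {
        if (a.size() != 1 || a[0] < '0' || a[0] > '9') return {};
        row.push_back(a[0] - '0');       // single digit 0-9
    }
    return row;
}
```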

family_mar.exe

Multithreaded version of the family_mar program. Searches for Fancy DLS.

Expects 2 arguments.
Searches for Fancy squares among Family Diagonal Latin Squares (not symmetric).
family_mar.exe input output
 input : file to read Latin Squares from
 output: file to write Fancy Diagonal Latin Squares to


postprocess.exe

Based on zamyk.bat by Belyshev. Finds additional CF ODLKs from the ones already in the input file. It basically mimics the operation of zamyk.bat up to line 38. No uniqueness check is performed; the results are simply appended to the output file. This code should have been part of the BOINC app, but for some reason was not. This program is multithreaded and will use as many CPU cores as possible.

Command line: postprocess.exe input output

Input is read from the input file; output is APPENDED to the output file. The output file is not overwritten.
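
A minimal sketch of those two behaviours. File and function names here are illustrative, not the program's actual code:

```cpp
#include <fstream>
#include <string>
#include <thread>
#include <vector>

// Sketch of the worker-count policy: use as many CPU cores as reported.
// hardware_concurrency() may return 0 when the count is unknown.
unsigned worker_count() {
    unsigned n = std::thread::hardware_concurrency();
    return n ? n : 1;
}

// Sketch of the output policy: results are APPENDED, never truncating
// the output file (std::ios::app).
void append_results(const std::string& path,
                    const std::vector<std::string>& lines) {
    std::ofstream out(path, std::ios::app);
    for (const std::string& l : lines) out << l << '\n';
}
```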

ortogencnt.exe

This is basically ortogen.exe combined with izvl.exe and type_count.exe. Additionally, it checks for uniqueness and discards duplicate squares from the input.
This program is multithreaded too.

ortogoncnt.exe -wco input
 -w : write sorted and unique cf odls back to input file
 -c : count cf odls grouped by number of their co-squares
 -o : write out_ortogon.txt and out_kf_N.txt files
 input : database of fancy diagonal latin squares


CF ODLSs are read from the input file and duplicates are discarded. If the -w option was set, the input file is rewritten with the sorted, unique squares. If the -c option was set, statistics are printed to STDOUT in a form like this:

Found Fancy CF:
count[1]: 129066
count[2]: 197
count[3]: 1
All: 129267
Found CF co-squares: 129412


If the -o option was set, the files out_ortogon.txt, out_kf_X.txt, and out_kf_mates.txt will be written. If such files exist, they will be overwritten.
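
The -c grouping can be sketched like this. This is a simplified model, not the program's code: assume we already know, for each CF square, how many co-squares it has.

```cpp
#include <map>
#include <vector>

// Sketch of the -c statistic: group CF squares by how many co-squares
// each has, mirroring the "count[k]" lines shown above.
struct FancyStats {
    std::map<int, long> count;  // count[k] = squares with k co-squares
    long all = 0;               // total CF squares
    long cosquares = 0;         // total co-squares found
};

FancyStats tally(const std::vector<int>& cosquares_per_square) {
    FancyStats s;
    for (int k : cosquares_per_square) {
        ++s.count[k];
        ++s.all;
        s.cosquares += k;       // each square contributes k co-squares
    }
    return s;
}
```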
5) Message boards : News : Autonomous Subproject (Message 716)
Posted 27 Jan 2019 by Tomas Brada
Post:
Natalia Makarova
Please remember that I am doing this just because I find it interesting.

Good news: I finished coding the BOINC app for PADLS! But some tweaking of the required libraries is still needed, and I have to try to compile it for Windows.
6) Message boards : Number crunching : The processing of results (Message 702)
Posted 22 Jan 2019 by Tomas Brada
Post:
Do you think it is worth the hassle to introduce a new format?

Btw: a good compression ratio is achieved! Encoding an arbitrary (non-Latin) 10x10 square of digits 0-9 would take 56 characters. This is only 25.
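
For what it's worth, the 56-character figure is consistent with a 62-symbol alphanumeric alphabet, though that alphabet choice is an assumption here, not stated in the post: ceil(100 · log 10 / log 62) = 56.

```cpp
#include <cmath>

// Information-theoretic bound: characters needed to encode `digits`
// base-10 digits using an alphabet of `alphabet` symbols. The 62-symbol
// alphabet (A-Z, a-z, 0-9) is an assumption; the post does not say which
// alphabet the 56-character figure uses.
int chars_needed(int digits, int alphabet) {
    return static_cast<int>(
        std::ceil(digits * std::log(10.0) / std::log(static_cast<double>(alphabet))));
}
```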
7) Message boards : News : Autonomous Subproject (Message 701)
Posted 22 Jan 2019 by Tomas Brada
Post:
You have a great output format!
# Generator=generator_lk_4_31_31 family=IAEDFXNKO
# Generator=generator_lk_4_31_31 family=IAEDFXONK
# Generator=generator_lk_4_31_31 family=IAEDFXOQH
0 6 4 2 7 3 8 9 5 1
3 1 9 8 6 7 2 5 0 4
6 5 2 4 1 9 0 3 7 8
8 7 1 3 5 0 4 6 9 2
2 9 7 1 4 8 3 0 6 5
4 3 8 6 9 5 7 1 2 0
9 0 3 5 2 4 6 8 1 7
1 8 6 9 0 2 5 7 4 3
5 2 0 7 3 1 9 4 8 6
7 4 5 0 8 6 1 2 3 9

Looking at the source of the programs (klpmd/ortogon...), I see that feeding them this file directly will cause problems: the digits in the comment lines (4 3 1 3 1) would be taken as part of the next square. I would like to make sure that you remove the comments before processing the files.
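
A minimal pre-filter sketch (illustrative only, not part of the repository) that would make such a file safe to feed in: drop every line starting with "#" before the digits reach the square parser.

```cpp
#include <sstream>
#include <string>

// Drop "#" comment lines so digits inside the comments are not parsed
// as square data. Function name is illustrative.
std::string strip_comments(const std::string& text) {
    std::istringstream in(text);
    std::ostringstream out;
    std::string line;
    while (std::getline(in, line)) {
        if (!line.empty() && line[0] == '#') continue;  // skip comment
        out << line << '\n';
    }
    return out.str();
}
```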
8) Message boards : Number crunching : The processing of results (Message 693)
Posted 22 Jan 2019 by Tomas Brada
Post:
db unique kf dlk: 5329884
Processing rez_ODLK_May-dec_F2.txt : 385041
rez_ODLK_May-dec_F2.txt -: in: 385041, not lk: 0, not dlk: 0, dup: 0, new uniqe kf dlk: 385041                           
db unique kf dlk: 5714925

No duplicates with respect to the 5329884 database. 385041 new. Now 5714925 in the database.
9) Message boards : News : Autonomous Subproject (Message 692)
Posted 22 Jan 2019 by Tomas Brada
Post:
The version of family_mar in my repository is multithreaded. It just needs to be compiled for your system. For example, on my system:
git clone --recurse-submodules https://github.com/gridcoin-community/odlk-progs.git
cd odlk-progs
cmake -DCMAKE_BUILD_TYPE=Release .
make

For a Windows build example, see: https://github.com/tudelft3d/masbcpp/wiki/Building-on-Windows-with-CMake-and-MinGW
10) Message boards : Number crunching : The processing of results (Message 685)
Posted 21 Jan 2019 by Tomas Brada
Post:
The program works across all files. It goes like this: the first file is read and its CF DLS are added to a memory set. Then the second file is read; each square is converted to CF DLK and checked against the memory set. If it is not there, it is unique and is added to the set; if it already is in the set, the duplicate is reported and not added again.

rez_ODLK1_Oct_F2.txt -: in: 404478 not lk: 0 not dlk: 0 dup: 2 new uniqe kf dlk: 404476

rez_ODLK1_Oct_F2.txt - input file
in: 404478 - 404478 squares in input file
not lk: 0 - no squares failed the is_lk() check
not dlk: 0 - no squares failed is_dlk() check
dup: 2 - two duplicates
new uniqe kf dlk: 404476 - 404476 cf dls added to memory set

That means there are two KF DLK that are already in at least one of the previous files.
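
The duplicate check described above amounts to one hash set shared across all files. A simplified sketch, assuming the squares are already reduced to their KF string form (the is_lk()/is_dlk() checks are left out):

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// Per-file counters matching the report fields: in / dup / new unique.
struct FileReport { long in = 0, dup = 0, fresh = 0; };

// One memory set `db` is shared across files; an insert that fails means
// the square was already seen in this or an earlier file.
FileReport add_file(std::unordered_set<std::string>& db,
                    const std::vector<std::string>& kf_squares) {
    FileReport r;
    for (const std::string& kf : kf_squares) {
        ++r.in;
        if (db.insert(kf).second) ++r.fresh;  // newly added => unique
        else ++r.dup;                          // duplicate, not re-added
    }
    return r;
}
```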
11) Message boards : Number crunching : The processing of results (Message 681)
Posted 21 Jan 2019 by Tomas Brada
Post:
The report first shows which file is being processed and how many squares were in the file. Then, if a duplicate is found, it shows the square as it appears in the file and, if that square is not diagonal CF, also the KF of the duplicate.
12) Message boards : Number crunching : The processing of results (Message 680)
Posted 21 Jan 2019 by Tomas Brada
Post:
I reordered the files and this is the new report.
Processing DB_500000.txt : 500000
DB_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 500000
Processing DB_part2_500000.txt : 500000
DB_part2_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 1000000
Processing DB_part3_500000.txt : 500000
DB_part3_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 1500000
Processing DB_part4_500000.txt : 500000
DB_part4_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 2000000
Processing rez_ODLK1_may_F1.txt : 447612
rez_ODLK1_may_F1.txt -: in: 447612 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 447612
db unique kf dlk: 2447612
Processing rez_ODLK1_Jun_F1.txt : 486061
rez_ODLK1_Jun_F1.txt -: in: 486061 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 486061
db unique kf dlk: 2933673
Processing rez_ODLK1_Jul_F1.txt : 407150
rez_ODLK1_Jul_F1.txt -: in: 407150 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 407150
db unique kf dlk: 3340823
Processing rez_ODLK1_aug.txt : 400553
rez_ODLK1_aug.txt -: in: 400553 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 400553
db unique kf dlk: 3741376
Processing add_rez_9294.txt : 9294
duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 5 8 9 4 6 0
8 6 9 4 1 0 5 3 7 2
5 9 0 1 6 4 7 2 3 8
6 0 5 8 7 2 3 1 9 4
3 7 4 9 0 6 2 8 5 1
9 4 7 5 2 3 8 0 1 6
2 3 1 6 8 9 4 5 0 7
7 8 6 2 3 1 0 9 4 5
4 5 8 0 9 7 1 6 2 3

kf (lin=54) of the duplicate:
0 4 5 7 3 8 9 2 6 1
8 1 7 9 0 3 4 5 2 6
5 6 2 0 1 7 8 3 9 4
1 2 4 3 6 9 7 0 5 8
9 7 3 8 4 6 2 1 0 5
3 9 6 2 8 5 0 4 1 7
7 3 0 5 9 1 6 8 4 2
6 8 9 4 5 2 1 7 3 0
2 0 1 6 7 4 5 9 8 3
4 5 8 1 2 0 3 6 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 5 8 9 6 4 0
8 7 4 2 9 3 0 5 1 6
4 8 5 9 0 6 2 3 7 1
5 3 8 4 6 1 7 0 9 2
9 6 0 1 3 7 5 4 2 8
7 5 6 8 2 0 1 9 3 4
2 9 7 0 1 4 3 8 6 5
3 0 1 6 8 9 4 2 5 7
6 4 9 5 7 2 8 1 0 3

kf (lin=53) of the duplicate:
0 5 4 2 7 3 9 8 6 1
6 1 3 8 0 9 5 4 2 7
8 7 2 4 5 0 1 3 9 6
9 6 0 3 2 7 4 5 1 8
1 3 7 0 4 6 8 9 5 2
2 4 9 1 8 5 7 6 0 3
5 2 8 9 3 1 6 0 7 4
3 9 5 6 1 8 2 7 4 0
4 0 6 7 9 2 3 1 8 5
7 8 1 5 6 4 0 2 3 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 0 8 4 5
8 6 1 9 5 4 7 0 3 2
6 0 9 8 3 2 1 4 5 7
5 7 6 4 9 3 8 2 1 0
3 4 7 5 2 6 9 1 0 8
2 8 5 6 7 0 4 3 9 1
9 3 8 1 0 7 2 5 6 4
4 5 0 2 8 1 3 9 7 6
7 9 4 0 1 8 5 6 2 3

kf (lin=44) of the duplicate:
0 2 6 9 5 8 7 4 3 1
4 1 9 5 7 6 8 3 2 0
3 6 2 7 8 9 1 0 4 5
5 0 8 3 6 1 4 2 9 7
8 9 3 1 4 7 5 6 0 2
2 4 7 0 9 5 3 1 6 8
1 7 4 8 2 0 6 9 5 3
9 8 5 6 0 3 2 7 1 4
7 3 0 4 1 2 9 5 8 6
6 5 1 2 3 4 0 8 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 4 8 5 0
4 8 6 0 5 3 9 2 1 7
2 5 9 8 7 6 3 4 0 1
7 3 1 6 9 8 5 0 4 2
8 6 4 5 1 7 0 9 2 3
3 7 0 4 8 2 1 6 9 5
9 4 7 1 3 0 2 5 6 8
5 0 8 9 2 4 7 1 3 6
6 9 5 2 0 1 8 3 7 4

kf (lin=65) of the duplicate:
0 7 8 6 2 9 3 5 4 1
6 1 5 0 8 3 9 4 2 7
7 6 2 8 5 4 1 3 9 0
4 9 7 3 1 8 0 6 5 2
5 3 1 9 4 6 7 2 0 8
2 0 4 1 7 5 8 9 6 3
9 2 3 5 0 1 6 8 7 4
1 8 9 2 6 0 4 7 3 5
3 4 0 7 9 2 5 1 8 6
8 5 6 4 3 7 2 0 1 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 8 5 4 0
5 7 4 0 8 6 9 2 1 3
2 5 9 6 7 1 3 4 0 8
4 8 7 2 9 0 5 3 6 1
8 3 0 4 5 7 2 1 9 6
3 9 5 8 0 4 1 6 7 2
6 0 1 9 2 3 7 8 5 4
9 6 8 5 1 2 4 0 3 7
7 4 6 1 3 8 0 9 2 5

kf (lin=0) of the duplicate:
0 2 3 5 6 4 7 8 9 1
2 1 8 7 3 9 4 6 0 5
9 7 2 8 0 1 5 3 4 6
5 4 6 3 8 7 2 9 1 0
3 8 1 9 4 6 0 2 5 7
6 0 9 1 7 5 8 4 2 3
1 5 7 4 9 2 6 0 3 8
4 3 5 0 1 8 9 7 6 2
7 9 0 6 2 3 1 5 8 4
8 6 4 2 5 0 3 1 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 0 9 5 4 7 6
3 9 5 7 2 4 8 6 0 1
2 5 0 4 1 8 3 9 6 7
5 4 6 9 7 2 1 0 3 8
8 7 4 1 5 6 0 3 9 2
6 3 7 0 8 1 9 5 2 4
9 6 1 2 3 0 7 8 4 5
7 8 9 5 6 3 4 2 1 0
4 0 8 6 9 7 2 1 5 3

kf (lin=0) of the duplicate:
0 2 3 5 6 4 7 8 9 1
2 1 4 7 5 9 3 6 0 8
6 0 2 8 1 7 9 3 4 5
9 6 8 3 0 1 2 5 7 4
7 8 9 1 4 6 5 2 3 0
4 3 1 9 7 5 8 0 6 2
1 7 0 4 8 2 6 9 5 3
3 4 5 2 9 8 0 7 1 6
5 9 6 0 2 3 1 4 8 7
8 5 7 6 3 0 4 1 2 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 5 6 7 9 4 0
4 6 9 1 7 2 5 8 0 3
8 3 7 5 0 9 1 6 2 4
9 8 4 7 6 3 2 0 5 1
7 9 5 0 2 8 3 4 1 6
3 0 8 6 9 1 4 2 7 5
6 5 0 4 3 7 8 1 9 2
2 7 6 9 1 4 0 5 3 8
5 4 1 2 8 0 9 3 6 7

kf (lin=65) of the duplicate:
0 2 3 4 5 7 8 6 9 1
3 1 4 6 9 0 5 8 2 7
1 5 2 0 6 9 7 3 4 8
2 7 8 3 1 4 0 9 6 5
9 0 7 8 4 6 1 2 5 3
4 8 1 9 7 5 2 0 3 6
7 9 0 5 3 8 6 4 1 2
5 6 9 2 8 1 3 7 0 4
6 4 5 7 2 3 9 1 8 0
8 3 6 1 0 2 4 5 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 5 6 9 0 4 7
9 8 7 2 0 3 5 6 1 4
5 0 8 6 1 4 7 3 9 2
7 4 6 1 9 8 3 2 5 0
4 7 5 9 3 1 0 8 2 6
3 9 4 0 6 2 8 5 7 1
8 6 1 5 7 9 2 4 0 3
6 5 0 4 2 7 1 9 3 8
2 3 9 7 8 0 4 1 6 5

kf (lin=58) of the duplicate:
0 9 8 7 6 4 5 2 3 1
6 1 3 2 8 7 9 5 0 4
7 4 2 6 0 8 1 3 9 5
8 5 1 3 9 0 4 6 7 2
1 7 5 0 4 6 3 9 2 8
3 6 4 9 7 5 2 8 1 0
9 3 0 8 1 2 6 4 5 7
2 0 9 4 5 1 8 7 6 3
4 2 7 5 3 9 0 1 8 6
5 8 6 1 2 3 7 0 4 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 6 9 4 0 5 7
5 9 4 7 0 8 2 3 6 1
4 5 9 6 2 3 7 8 1 0
8 3 0 9 7 6 5 1 2 4
7 0 6 4 8 1 3 2 9 5
3 8 5 0 1 2 9 4 7 6
6 7 1 2 9 0 8 5 4 3
9 4 8 1 5 7 0 6 3 2
2 6 7 5 3 4 1 9 0 8

kf (lin=1) of the duplicate:
0 2 3 6 5 4 8 9 7 1
4 1 7 5 8 9 3 6 0 2
1 8 2 9 0 7 4 3 6 5
9 7 1 3 6 0 2 5 4 8
8 3 9 2 4 6 1 0 5 7
2 4 0 8 7 5 9 1 3 6
7 0 5 4 2 1 6 8 9 3
6 5 8 1 9 3 0 7 2 4
3 9 6 7 1 2 5 4 8 0
5 6 4 0 3 8 7 2 1 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 9 6 8 4 0 5 7
4 7 6 2 5 1 9 3 0 8
8 0 4 5 2 9 7 1 6 3
9 3 8 1 7 6 5 2 4 0
7 6 5 8 0 3 1 4 9 2
5 9 0 4 3 2 8 6 7 1
6 4 1 7 8 0 3 9 2 5
3 8 7 0 9 4 2 5 1 6
2 5 9 6 1 7 0 8 3 4

kf (lin=65) of the duplicate:
0 2 3 4 5 7 8 6 9 1
3 1 8 6 9 0 5 4 2 7
1 5 2 0 6 9 7 3 4 8
2 7 4 3 1 8 0 9 6 5
9 0 7 8 4 6 1 2 5 3
4 8 1 9 7 5 2 0 3 6
7 9 0 5 3 4 6 8 1 2
5 6 9 2 8 1 3 7 0 4
6 4 5 7 2 3 9 1 8 0
8 3 6 1 0 2 4 5 7 9

add_rez_9294.txt -: in: 9294 not lk: 0 not dlk: 0 dup: 10 new uniqe kf dlk: 9284
db unique kf dlk: 3750660
Processing addition_results_67879.txt : 67879
addition_results_67879.txt -: in: 67879 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 67879
db unique kf dlk: 3818539
Processing rez_ODLK1_Sep_F2.txt : 407995
rez_ODLK1_Sep_F2.txt -: in: 407995 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 407995
db unique kf dlk: 4226534
Processing rez_ODLK1_Oct_F2.txt : 404478
duplicate (orig) :
0 2 3 4 5 7 8 6 9 1
6 1 5 8 3 9 7 4 0 2
5 4 2 6 0 1 9 3 7 8
7 6 4 3 9 8 2 1 5 0
3 8 0 7 4 6 1 9 2 5
9 3 1 2 7 5 0 8 4 6
8 7 9 5 1 2 6 0 3 4
2 0 8 9 6 4 5 7 1 3
1 9 6 0 2 3 4 5 8 7
4 5 7 1 8 0 3 2 6 9

duplicate (orig) :
0 3 4 8 5 2 9 6 7 1
3 1 5 4 6 9 7 2 0 8
5 0 2 6 9 8 1 3 4 7
7 8 1 3 0 4 2 9 6 5
2 5 3 7 4 6 8 1 9 0
1 6 0 9 7 5 4 8 3 2
8 7 9 5 3 1 6 0 2 4
9 4 8 1 2 3 0 7 5 6
6 9 7 2 1 0 5 4 8 3
4 2 6 0 8 7 3 5 1 9

rez_ODLK1_Oct_F2.txt -: in: 404478 not lk: 0 not dlk: 0 dup: 2 new uniqe kf dlk: 404476
db unique kf dlk: 4631010
Processing rez_ODLK1_Nov_F2.txt : 355634
rez_ODLK1_Nov_F2.txt -: in: 355634 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 355634
db unique kf dlk: 4986644
Processing rez_ODLK1_Dec_F2.txt : 343240
rez_ODLK1_Dec_F2.txt -: in: 343240 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 343240
db unique kf dlk: 5329884
13) Message boards : Number crunching : The processing of results (Message 678)
Posted 21 Jan 2019 by Tomas Brada
Post:
There are 12 KF ODLK duplicates, of which 10 are literal. See this report:
Processing DB_500000.txt : 500000
DB_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 500000
Processing DB_part2_500000.txt : 500000
DB_part2_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 1000000
Processing DB_part3_500000.txt : 500000
DB_part3_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 1500000
Processing DB_part4_500000.txt : 500000
DB_part4_500000.txt -: in: 500000 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 500000
db unique kf dlk: 2000000
Processing addition_results_67879.txt : 67879
addition_results_67879.txt -: in: 67879 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 67879
db unique kf dlk: 2067879
Processing add_rez_9294.txt : 9294
add_rez_9294.txt -: in: 9294 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 9294
db unique kf dlk: 2077173
Processing rez_ODLK1_may_F1.txt : 447612
duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 4 8 5 0
4 8 6 0 5 3 9 2 1 7
2 5 9 8 7 6 3 4 0 1
7 3 1 6 9 8 5 0 4 2
8 6 4 5 1 7 0 9 2 3
3 7 0 4 8 2 1 6 9 5
9 4 7 1 3 0 2 5 6 8
5 0 8 9 2 4 7 1 3 6
6 9 5 2 0 1 8 3 7 4

kf (lin=65) of the duplicate:
0 7 8 6 2 9 3 5 4 1
6 1 5 0 8 3 9 4 2 7
7 6 2 8 5 4 1 3 9 0
4 9 7 3 1 8 0 6 5 2
5 3 1 9 4 6 7 2 0 8
2 0 4 1 7 5 8 9 6 3
9 2 3 5 0 1 6 8 7 4
1 8 9 2 6 0 4 7 3 5
3 4 0 7 9 2 5 1 8 6
8 5 6 4 3 7 2 0 1 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 5 6 7 9 4 0
4 6 9 1 7 2 5 8 0 3
8 3 7 5 0 9 1 6 2 4
9 8 4 7 6 3 2 0 5 1
7 9 5 0 2 8 3 4 1 6
3 0 8 6 9 1 4 2 7 5
6 5 0 4 3 7 8 1 9 2
2 7 6 9 1 4 0 5 3 8
5 4 1 2 8 0 9 3 6 7

kf (lin=65) of the duplicate:
0 2 3 4 5 7 8 6 9 1
3 1 4 6 9 0 5 8 2 7
1 5 2 0 6 9 7 3 4 8
2 7 8 3 1 4 0 9 6 5
9 0 7 8 4 6 1 2 5 3
4 8 1 9 7 5 2 0 3 6
7 9 0 5 3 8 6 4 1 2
5 6 9 2 8 1 3 7 0 4
6 4 5 7 2 3 9 1 8 0
8 3 6 1 0 2 4 5 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 5 6 9 0 4 7
9 8 7 2 0 3 5 6 1 4
5 0 8 6 1 4 7 3 9 2
7 4 6 1 9 8 3 2 5 0
4 7 5 9 3 1 0 8 2 6
3 9 4 0 6 2 8 5 7 1
8 6 1 5 7 9 2 4 0 3
6 5 0 4 2 7 1 9 3 8
2 3 9 7 8 0 4 1 6 5

kf (lin=58) of the duplicate:
0 9 8 7 6 4 5 2 3 1
6 1 3 2 8 7 9 5 0 4
7 4 2 6 0 8 1 3 9 5
8 5 1 3 9 0 4 6 7 2
1 7 5 0 4 6 3 9 2 8
3 6 4 9 7 5 2 8 1 0
9 3 0 8 1 2 6 4 5 7
2 0 9 4 5 1 8 7 6 3
4 2 7 5 3 9 0 1 8 6
5 8 6 1 2 3 7 0 4 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 9 6 8 4 0 5 7
4 7 6 2 5 1 9 3 0 8
8 0 4 5 2 9 7 1 6 3
9 3 8 1 7 6 5 2 4 0
7 6 5 8 0 3 1 4 9 2
5 9 0 4 3 2 8 6 7 1
6 4 1 7 8 0 3 9 2 5
3 8 7 0 9 4 2 5 1 6
2 5 9 6 1 7 0 8 3 4

kf (lin=65) of the duplicate:
0 2 3 4 5 7 8 6 9 1
3 1 8 6 9 0 5 4 2 7
1 5 2 0 6 9 7 3 4 8
2 7 4 3 1 8 0 9 6 5
9 0 7 8 4 6 1 2 5 3
4 8 1 9 7 5 2 0 3 6
7 9 0 5 3 4 6 8 1 2
5 6 9 2 8 1 3 7 0 4
6 4 5 7 2 3 9 1 8 0
8 3 6 1 0 2 4 5 7 9

rez_ODLK1_may_F1.txt -: in: 447612 not lk: 0 not dlk: 0 dup: 4 new uniqe kf dlk: 447608
db unique kf dlk: 2524781
Processing rez_ODLK1_Jun_F1.txt : 486061
duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 5 8 9 4 6 0
8 6 9 4 1 0 5 3 7 2
5 9 0 1 6 4 7 2 3 8
6 0 5 8 7 2 3 1 9 4
3 7 4 9 0 6 2 8 5 1
9 4 7 5 2 3 8 0 1 6
2 3 1 6 8 9 4 5 0 7
7 8 6 2 3 1 0 9 4 5
4 5 8 0 9 7 1 6 2 3

kf (lin=54) of the duplicate:
0 4 5 7 3 8 9 2 6 1
8 1 7 9 0 3 4 5 2 6
5 6 2 0 1 7 8 3 9 4
1 2 4 3 6 9 7 0 5 8
9 7 3 8 4 6 2 1 0 5
3 9 6 2 8 5 0 4 1 7
7 3 0 5 9 1 6 8 4 2
6 8 9 4 5 2 1 7 3 0
2 0 1 6 7 4 5 9 8 3
4 5 8 1 2 0 3 6 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 5 8 9 6 4 0
8 7 4 2 9 3 0 5 1 6
4 8 5 9 0 6 2 3 7 1
5 3 8 4 6 1 7 0 9 2
9 6 0 1 3 7 5 4 2 8
7 5 6 8 2 0 1 9 3 4
2 9 7 0 1 4 3 8 6 5
3 0 1 6 8 9 4 2 5 7
6 4 9 5 7 2 8 1 0 3

kf (lin=53) of the duplicate:
0 5 4 2 7 3 9 8 6 1
6 1 3 8 0 9 5 4 2 7
8 7 2 4 5 0 1 3 9 6
9 6 0 3 2 7 4 5 1 8
1 3 7 0 4 6 8 9 5 2
2 4 9 1 8 5 7 6 0 3
5 2 8 9 3 1 6 0 7 4
3 9 5 6 1 8 2 7 4 0
4 0 6 7 9 2 3 1 8 5
7 8 1 5 6 4 0 2 3 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 8 5 4 0
5 7 4 0 8 6 9 2 1 3
2 5 9 6 7 1 3 4 0 8
4 8 7 2 9 0 5 3 6 1
8 3 0 4 5 7 2 1 9 6
3 9 5 8 0 4 1 6 7 2
6 0 1 9 2 3 7 8 5 4
9 6 8 5 1 2 4 0 3 7
7 4 6 1 3 8 0 9 2 5

kf (lin=0) of the duplicate:
0 2 3 5 6 4 7 8 9 1
2 1 8 7 3 9 4 6 0 5
9 7 2 8 0 1 5 3 4 6
5 4 6 3 8 7 2 9 1 0
3 8 1 9 4 6 0 2 5 7
6 0 9 1 7 5 8 4 2 3
1 5 7 4 9 2 6 0 3 8
4 3 5 0 1 8 9 7 6 2
7 9 0 6 2 3 1 5 8 4
8 6 4 2 5 0 3 1 7 9

duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 0 9 5 4 7 6
3 9 5 7 2 4 8 6 0 1
2 5 0 4 1 8 3 9 6 7
5 4 6 9 7 2 1 0 3 8
8 7 4 1 5 6 0 3 9 2
6 3 7 0 8 1 9 5 2 4
9 6 1 2 3 0 7 8 4 5
7 8 9 5 6 3 4 2 1 0
4 0 8 6 9 7 2 1 5 3

kf (lin=0) of the duplicate:
0 2 3 5 6 4 7 8 9 1
2 1 4 7 5 9 3 6 0 8
6 0 2 8 1 7 9 3 4 5
9 6 8 3 0 1 2 5 7 4
7 8 9 1 4 6 5 2 3 0
4 3 1 9 7 5 8 0 6 2
1 7 0 4 8 2 6 9 5 3
3 4 5 2 9 8 0 7 1 6
5 9 6 0 2 3 1 4 8 7
8 5 7 6 3 0 4 1 2 9

rez_ODLK1_Jun_F1.txt -: in: 486061 not lk: 0 not dlk: 0 dup: 4 new uniqe kf dlk: 486057
db unique kf dlk: 3010838
Processing rez_ODLK1_Jul_F1.txt : 407150
duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 7 6 9 0 8 4 5
8 6 1 9 5 4 7 0 3 2
6 0 9 8 3 2 1 4 5 7
5 7 6 4 9 3 8 2 1 0
3 4 7 5 2 6 9 1 0 8
2 8 5 6 7 0 4 3 9 1
9 3 8 1 0 7 2 5 6 4
4 5 0 2 8 1 3 9 7 6
7 9 4 0 1 8 5 6 2 3

kf (lin=44) of the duplicate:
0 2 6 9 5 8 7 4 3 1
4 1 9 5 7 6 8 3 2 0
3 6 2 7 8 9 1 0 4 5
5 0 8 3 6 1 4 2 9 7
8 9 3 1 4 7 5 6 0 2
2 4 7 0 9 5 3 1 6 8
1 7 4 8 2 0 6 9 5 3
9 8 5 6 0 3 2 7 1 4
7 3 0 4 1 2 9 5 8 6
6 5 1 2 3 4 0 8 7 9

rez_ODLK1_Jul_F1.txt -: in: 407150 not lk: 0 not dlk: 0 dup: 1 new uniqe kf dlk: 407149
db unique kf dlk: 3417987
Processing rez_ODLK1_aug.txt : 400553
duplicate (orig) :
0 1 2 3 4 5 6 7 8 9
1 2 3 8 6 9 4 0 5 7
5 9 4 7 0 8 2 3 6 1
4 5 9 6 2 3 7 8 1 0
8 3 0 9 7 6 5 1 2 4
7 0 6 4 8 1 3 2 9 5
3 8 5 0 1 2 9 4 7 6
6 7 1 2 9 0 8 5 4 3
9 4 8 1 5 7 0 6 3 2
2 6 7 5 3 4 1 9 0 8

kf (lin=1) of the duplicate:
0 2 3 6 5 4 8 9 7 1
4 1 7 5 8 9 3 6 0 2
1 8 2 9 0 7 4 3 6 5
9 7 1 3 6 0 2 5 4 8
8 3 9 2 4 6 1 0 5 7
2 4 0 8 7 5 9 1 3 6
7 0 5 4 2 1 6 8 9 3
6 5 8 1 9 3 0 7 2 4
3 9 6 7 1 2 5 4 8 0
5 6 4 0 3 8 7 2 1 9

rez_ODLK1_aug.txt -: in: 400553 not lk: 0 not dlk: 0 dup: 1 new uniqe kf dlk: 400552
db unique kf dlk: 3818539
Processing rez_ODLK1_Sep_F2.txt : 407995
rez_ODLK1_Sep_F2.txt -: in: 407995 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 407995
db unique kf dlk: 4226534
Processing rez_ODLK1_Oct_F2.txt : 404478
duplicate (orig) :
0 2 3 4 5 7 8 6 9 1
6 1 5 8 3 9 7 4 0 2
5 4 2 6 0 1 9 3 7 8
7 6 4 3 9 8 2 1 5 0
3 8 0 7 4 6 1 9 2 5
9 3 1 2 7 5 0 8 4 6
8 7 9 5 1 2 6 0 3 4
2 0 8 9 6 4 5 7 1 3
1 9 6 0 2 3 4 5 8 7
4 5 7 1 8 0 3 2 6 9

duplicate (orig) :
0 3 4 8 5 2 9 6 7 1
3 1 5 4 6 9 7 2 0 8
5 0 2 6 9 8 1 3 4 7
7 8 1 3 0 4 2 9 6 5
2 5 3 7 4 6 8 1 9 0
1 6 0 9 7 5 4 8 3 2
8 7 9 5 3 1 6 0 2 4
9 4 8 1 2 3 0 7 5 6
6 9 7 2 1 0 5 4 8 3
4 2 6 0 8 7 3 5 1 9

rez_ODLK1_Oct_F2.txt -: in: 404478 not lk: 0 not dlk: 0 dup: 2 new uniqe kf dlk: 404476
db unique kf dlk: 4631010
Processing rez_ODLK1_Nov_F2.txt : 355634
rez_ODLK1_Nov_F2.txt -: in: 355634 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 355634
db unique kf dlk: 4986644
Processing rez_ODLK1_Dec_F2.txt : 343240
rez_ODLK1_Dec_F2.txt -: in: 343240 not lk: 0 not dlk: 0 dup: 0 new uniqe kf dlk: 343240
db unique kf dlk: 5329884

There are
4 in rez_ODLK1_may_F1.txt,
4 in rez_ODLK1_Jun_F1.txt,
1 in rez_ODLK1_Jul_F1.txt,
1 in rez_ODLK1_aug.txt, and
2 in rez_ODLK1_Oct_F2.txt.
14) Message boards : Number crunching : The processing of results (Message 676)
Posted 21 Jan 2019 by Tomas Brada
Post:
CFs LS may be less than CFs DLS. It is right.

The first step in zamyk.bat is converting the input into CF LK. Can that result in a loss of solutions?
15) Message boards : Number crunching : The processing of results (Message 672)
Posted 20 Jan 2019 by Tomas Brada
Post:
Arithmetic union of DB parts contains 5,329,896 CFs ODLS.

I got the same number of squares: 5,329,896, but that is not a union, just a concatenation.
However, kanonizator_dlk found only 5,329,884 CF DLK, a difference of 12.
The program kanonizator_lk reports that it can only read 5329884 squares, and found only 5212580 KF LK.
My program reports only 10 duplicate squares (disregarding format).

We need to check where the inconsistency comes from.
16) Message boards : Number crunching : The processing of results (Message 661)
Posted 20 Jan 2019 by Tomas Brada
Post:
https://boinc.multi-pool.info/latinsquares/forum_thread.php?id=57&postid=656
Natalia Makarova
So we have 226322 +184328 = 410650 CFs.

I got exactly that many (410650) Mariažne CF DLK with my postprocessing program. It looks like I did not make a mistake in the program. (batch Sep 2018)
From them:
Found Fancy CF:
count[1]: 409774
count[2]: 876
All: 410650
Found CF co-squares: 403590

But I did not perform a uniqueness check against the 5M database, only an (implicit) check within the file itself.

What form is your database in? Is it just a text file? With that many entries it should be a real database, like SQLite or PostgreSQL.
If it is just a file, a program to check for uniqueness is very easy to make.
17) Message boards : News : Autonomous Subproject (Message 655)
Posted 19 Jan 2019 by Tomas Brada
Post:
I published compatible programs for processing results to my repository.
There are two programs: postprocess and ortogoncnt. Both are multithreaded and work on Linux systems. They should work on Windows too, once compiled.
The postprocess program reads the results from the BOINC project, finds additional CF ODLSs (like zamyk.bat does), and adds them to the database file.
The ortogencnt program removes duplicate squares from the database, computes statistics, and outputs the out_ortogon.txt, out_kf_X.txt, and out_kf_mates.txt files, like zamyk.bat.
More information in the Readme file.

The first program (postprocess) should have been part of the BOINC application; it would save a lot of time in result processing. I think it should be added to the new PADLS BOINC application.

I am currently processing the results from September and I will compare them to XAVER's results to validate. Hopefully I have not introduced an error.
18) Message boards : News : Autonomous Subproject (Message 640)
Posted 18 Jan 2019 by Tomas Brada
Post:
(me) I can give a file with CF ODLSs together with their KF co-squares (like in your diagram).

Ignore this. It does not make sense to do, and it is different from the diagram.
19) Message boards : News : Autonomous Subproject (Message 636)
Posted 18 Jan 2019 by Tomas Brada
Post:
Natalia Makarova
I made very good progress on the result processing. But I need help. What outputs are desired? And in what format?
You want the CF ODL squares, that is for sure.
Then the CF ODL squares grouped by the number of co-squares of each.
And then the number of squares in each group (count[1] = 1898, count[2] = 12, ...).
I can give a file with CF ODLSs together with their KF co-squares (like in your diagram).
I can make the output whatever you want!
Or should I replicate the exact format of Belyshev's programs?

Second: in the family_mar program there is the condition:
if(!is_simm(kf, lin))
so the program will not produce symmetric(?) solutions. However, the program klpmd does not have such a condition. Is that correct? Which should be used for the PADLS experiment?
20) Message boards : News : Autonomous Subproject (Message 630)
Posted 17 Jan 2019 by Tomas Brada
Post:
We have problems processing the results in the ODLK1 project. Processing of the results for September - December has still not been performed. Project results are published but not processed.
See
https://boinc.multi-pool.info/latinsquares/forum_thread.php?id=56
You can download the unprocessed results for September - December here
https://drive.google.com/open?id=1wIf97_bLDvnb1WalOsjU9vODZqRZ_WNw
Can you process these results?

I will try to process the results.

I am not running the PADLS experiment now. However, I made good progress implementing the BOINC API. It is close to finished. I will run the experiment through BOINC once it is done.

I found in the PADLS experiment two groups of five pairs of ODLS

I am happy that you are receiving good results.

XAVER: Are you using programs from my repository? I ask because I introduced an error there.




©2019 Progger & Stefano Tognon (ice00)