Font API Version 2.1.9 — Release Archives/Packs



Here at Chronolabs Cooperative (wishcraft) we have been working on version 2.x of the Fonts API, which is now released as version 2.1.x. This release does not include the unison cloud peering system, which will be added in version 2.2.x, coming out in a month or two.

You can see the new font API operating at http://fonts.labs.coop; it is a complete Debian/Ubuntu solution. Installation involves quite a bit of configuration and you will need to be able to read the code. The supporting system packages must be installed (apt-get install fontforge zip unzip bzip 7zip rar, and so on), and for some of the cronjobs you will need to set user permissions with chown and chmod calls.

You can download the Font API version 2.1.9 from SourceForge here: https://sourceforge.net/projects/chronolabsapis/files/fonts.labs.coop/Version%202.x/2.1.9/

In version 2.2.x we will be including a peer-to-peer cloud font sharing system without reproduction, as well as glyph output previews, and we will be rounding out the callbacks and other options involved in the peer-to-peer font cloud sharing solution with a peering identity checksum!

This is the final revision of the 2.1 minor series!


Not all fonts work in Mozilla or Chrome!!


I have been writing a font API that provides every font library possible for the system to use; I have version 1 running on https://fonts.ringwould.com.au and the new version 2.x on http://fonts.labs.coop. The interesting thing is that in version 1 we only provide the following CSS:

/** Font: Reef **/
@font-face {
	font-family: 'Reef';
	src: url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/eot.api');
	src: local('|'), url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/woff.api') format('woff'), url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/otf.api') format('truetype'), url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/ttf.api') format('truetype'), url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/svg.api') format('svg'), url('http://fonts.ringwould.com.au/v1/font/160bf7ba5405eaa7d027644692b30f74/afm.api') format('afm');
	font-weight: normal;
	font-style: normal;
}

But in version 2.x we are importing every device-supported version available, like so:

/** Font: ae_Tholoth **/
@font-face {
	font-family: 'ae_Tholoth';
	src: url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/eot.api');
	src: local('||'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/afm.api') format('afm'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/bdf.api') format('bdf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/bin.api') format('bin'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/cef.api') format('cef'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/cff.api') format('cff'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/cid.api') format('cid'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/dfont.api') format('dfont'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/eot.api') format('eot'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/fnt.api') format('fnt'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/fon.api') format('fon'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/gai.api') format('gai'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/gsf.api') format('gsf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/hqx.api') format('hqx'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/ik.api') format('ik'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/mf.api') format('mf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/otb.api') format('otb'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/otf.api') format('otf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pcf.api') format('pcf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pdb.api') format('pdb'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pf3.api') format('pf3'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pfa.api') format('pfa'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pfb.api') format('pfb'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pmf.api') format('pmf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/pt3.api') format('pt3'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/sfd.api') format('sfd'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/svg.api') format('svg'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/t42.api') format('t42'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/ttc.api') format('ttc'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/ttf.api') format('ttf'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/ufo.api') format('ufo'), url('http://fonts.labs.coop/v2/font/e786348c90061b278174bee9ede36b79/woff.api') format('woff');
	font-weight: normal;
	font-style: normal;
}

You will see the difference between the two browsers: the font preview will display in one, but the browser opts for a different font format from what the online examples suggest (just woff, eot, otf, ttf and svg as the base preference), and it either isn't using that format at all or isn't able to read it correctly. Part of this comes down to the *.UFO format; what do you think displays a *.UFO file, which is a font layering format?

Because of this flaw in all browsers, I am probably going to have to introduce MIME-type based output in the CSS system, keyed on the User-Agent of whatever is browsing the API.
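
As a rough illustration of what I mean, here is a minimal PHP sketch of User-Agent keyed output; the format mapping, the hash and the family name are either copied from the example above or simply assumed, not the API's actual code.

<?php
// Hypothetical sketch only: pick a preferred font format from the User-Agent
// and emit an @font-face rule that serves just that format.
function preferredFormat(string $userAgent): string
{
    // Very rough mapping; real detection would need to be far more careful.
    if (stripos($userAgent, 'MSIE') !== false || stripos($userAgent, 'Trident') !== false) {
        return 'eot';    // legacy Internet Explorer
    }
    if (stripos($userAgent, 'Firefox') !== false || stripos($userAgent, 'Chrome') !== false) {
        return 'woff';   // modern Gecko/Blink builds
    }
    if (stripos($userAgent, 'Safari') !== false) {
        return 'ttf';
    }
    return 'woff';       // sensible default
}

$hash   = 'e786348c90061b278174bee9ede36b79';   // font checksum from the example URL above
$family = 'ae_Tholoth';
$format = preferredFormat($_SERVER['HTTP_USER_AGENT'] ?? '');

header('Content-Type: text/css');
echo "@font-face {\n";
echo "\tfont-family: '{$family}';\n";
echo "\tsrc: url('http://fonts.labs.coop/v2/font/{$hash}/{$format}.api') format('{$format}');\n";
echo "\tfont-weight: normal;\n\tfont-style: normal;\n}\n";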

Open Torrent Tracker API v2.0.1


Overnight I started a torrent tracker API, version 2.0.1. The tracker part of it is complete, and it also has peer notification with the tracker, which will cloud the API with other sessions of the API sharing seeds and torrent peers within its framework.

The strategy of this is to keep torrents alive for longer, and it does this through a callback API and a cron-scheduled task set to as low an interval as possible.
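
Purely as an illustration (none of this is the tracker's actual code), a cron-driven peer notification could look something like the sketch below; the peers.json and torrents.json files and the /v2/callback/announce.api path are assumptions.

<?php
// Illustrative sketch only: a cron-run script that notifies known peer tracker
// sessions about locally known torrents, so seeds stay visible across peers.
$peersFile    = __DIR__ . '/peers.json';      // list of peer API sessions (assumed)
$torrentsFile = __DIR__ . '/torrents.json';   // torrents this tracker knows about (assumed)

$peers    = is_file($peersFile) ? json_decode(file_get_contents($peersFile), true) : [];
$torrents = is_file($torrentsFile) ? json_decode(file_get_contents($torrentsFile), true) : [];

foreach ((array) $peers as $peer) {
    $ch = curl_init($peer['url'] . '/v2/callback/announce.api');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode(['torrents' => $torrents]),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => 10,
        CURLOPT_TIMEOUT        => 30,
    ]);
    curl_exec($ch);   // best effort; the next cron run simply retries
    curl_close($ch);
}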

You can download this from Chronolabs APIs Files


Installing Subsonic 5/6 on Ubuntu 16.04 LTS (Xenial or earlier) !! (Complete Method)


I have noticed that installing Subsonic 5.x from http://subsonic.org with the *.deb download is fraught with issues; follow the instructions below to install Subsonic 5.x with no issues on Ubuntu 15.04.

You can see this running at http://media.labs.coop

Open your terminal by pressing Ctrl + Alt + T.

Now run the following commands one by one:

$ cd ~/
$ wget -q -O - http://archive.getdeb.net/getdeb-archive.key | sudo apt-key add -
$ sudo sh -c 'echo "deb http://archive.getdeb.net/ubuntu xenial-getdeb apps" >> /etc/apt/sources.list.d/getdeb.list'
$ sudo apt-get update
$ sudo apt-get install openjdk-8-jre -y
$ sudo apt-get install subsonic -y

Change the user Subsonic runs as (by default it’s root)

sudo nano /etc/default/subsonic

Change SUBSONIC_USER to your main user (such as www-data). This step is not required.

SUBSONIC_USER="www-data"

If you want to change the Subsonic port and memory, adjust the values in this line:

SUBSONIC_ARGS="--init-memory=256 --max-memory=512"

If you want to limit memory and change the port (the default is 4040) or add HTTPS, add these values to the uncommented line (the one without the #):

SUBSONIC_ARGS="--port=8080 --https-port=8443 --init-memory=256 --max-memory=512"

Press Ctrl+X, then Y and Enter to save. You will restart Subsonic after installing the transcoding tools.

Install the Subsonic transcoding tools for Ubuntu; this will work on 14.x, 15.x and later as long as the libav-tools package exists:

sudo apt-get install libav-tools xmp lame flac -y

Remove the links to the default Subsonic transcoders:

sudo rm /var/subsonic/transcode/ffmpeg
sudo rm /var/subsonic/transcode/lame
sudo rm /var/subsonic/transcode/xmp
sudo rm /var/subsonic/transcode/flac

Subsonic expects ffmpeg, so we trick the symlink into using avconv instead:

sudo ln -s /usr/bin/avconv /var/subsonic/transcode/ffmpeg
sudo ln -s /usr/bin/flac /var/subsonic/transcode/flac
sudo ln -s /usr/bin/xmp /var/subsonic/transcode/xmp
sudo ln -s /usr/bin/lame /var/subsonic/transcode/lame

Now you have installed Subsonic on Ubuntu and can begin configuring it and adding your media.

Entities.labs.coop ~ Pre-alpha ! v2.1.6


I added a few more of the required libraries to entities.labs.coop; the follow-on files were built overnight. I tend to do no programming in the light of day; there is too much physical energy in the air with the light and using manual input!

The following files were added to the Subversion repository for http://entities.labs.coop:

  1. /avartas/.htaccess — avatar storage folder
  2. /avartas/small/.htaccess — small 80×80 avatar storage folder
  3. /avartas/medium/.htaccess — medium 160×160 avatar storage folder
  4. /avartas/large/.htaccess — large 200×200 avatar storage folder
  5. /avartas/original/.htaccess — original-file avatar storage folder
  6. /class/WideImage/* — the WideImage library, from SourceForge projects
  7. /cronjobs/gravatar-mapping.php — Gravatar.com crawler
  8. /data/.htaccess
  9. /data/200×200.png — logo for emails
  10. /examples/.htaccess
  11. /sql/.htaccess
  12. /sql/entities-labs-coop_avatars_emails.sql
  13. /sql/entities-labs-coop_avatars_entities.sql
  14. /sql/entities-labs-coop_peers.sql
  15. /entity-edit-form.php
  16. /entity-edit-password.php
  17. /entity-edit-split.php
  18. /entity-edit.php
  19. /entity-view-form.php
  20. /entity-view-password.php
  21. /entity-view.php
  22. /logo-verify.php

Making a Callback API Cache Correctly for Bigger Systems


Making a callback cache for API callbacks is quite straightforward. Say you were building this for a telecommunications company that had to notify nearly its whole customer base about functions happening on a phone: one callback when an SMS is received by the network, another callback when the SMS or MMS is received on the device, and the same again for calls, durations and all the other things most people never get to use because of limitations in the devices we use. This is how you make a workable callback cache.

Most programmers believe that the callback is made at the point of contact for the notification; that is not the case. What the system should do at that point is write a checksum cache file for that callback, say fkjdsf8h934yt89h48h834h3.json, in a path structure like /function/callback-base-domain/callback-subdomains/client-username/fkjdsf8h934yt89h48h834h3.json.
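
A minimal PHP sketch of that queueing step, assuming a hypothetical queueCallback() helper and a /var/cache/callbacks base directory (both names are made up; the layout follows the path structure above):

<?php
// Sketch only: instead of firing the callback immediately, write it into the
// file cache using the /function/base-domain/subdomains/client/checksum.json layout.
function queueCallback(string $function, string $callbackUrl, string $client, array $payload): string
{
    $host   = parse_url($callbackUrl, PHP_URL_HOST) ?: 'unknown-host';
    $labels = explode('.', $host);
    $base   = implode('.', array_slice($labels, -2));          // callback base domain
    $subs   = implode('.', array_slice($labels, 0, -2)) ?: '_'; // callback subdomains, if any

    $dir = "/var/cache/callbacks/{$function}/{$base}/{$subs}/{$client}";
    if (!is_dir($dir)) {
        mkdir($dir, 0770, true);
    }

    $record   = ['url' => $callbackUrl, 'payload' => $payload, 'queued' => time()];
    $checksum = md5(json_encode($record));                      // checksum names the cache file
    file_put_contents("{$dir}/{$checksum}.json", json_encode($record));

    return $checksum;
}

// Example: queue an "sms-received" notification for a client (values are illustrative).
queueCallback('sms-received', 'https://api.example.com/hooks/sms', 'client-username',
              ['msisdn' => '+61400000000', 'received' => date('c')]);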

You would then have a function which scoops up, say, 200 of these every 4 minutes, since the cURL connection and response timeouts can be (and are best) set as high as 360 seconds. Keep a little JSON file in each /function/callback-base-domain/callback-subdomains/client-username/ path that tracks the callback's failure and success rate. If a callback fails, say, 7 times, write another JSON file that the scoop searches for, such as sandboxed.json, which simply contains the time at which the endpoint comes out of the sandbox; when the scoop finds it, it just skips that path until the sandbox expires, then drops the sandbox. If an endpoint stays sandboxed for over 7 days, you take that callback offline for a near-complete failure rate; otherwise, 3 or 4 days in, you send a warning notice.
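
Here is a rough PHP sketch of the scoop under the same assumptions; the stats.json and sandboxed.json file names, the batch size and the thresholds are illustrative rather than fixed:

<?php
// Sketch only: cron runs this every few minutes; it scoops up to 200 queued
// callbacks, skips sandboxed paths, and tracks per-path failure counts.
const BATCH = 200;
const MAX_FAILURES = 7;

$queued = glob('/var/cache/callbacks/*/*/*/*/*.json') ?: [];
$sent   = 0;

foreach ($queued as $file) {
    if ($sent >= BATCH) break;

    $dir  = dirname($file);
    $name = basename($file);
    if ($name === 'stats.json' || $name === 'sandboxed.json') continue;

    // Respect an active sandbox until its release time has passed.
    if (is_file("$dir/sandboxed.json")) {
        $sandbox = json_decode(file_get_contents("$dir/sandboxed.json"), true);
        if (time() < ($sandbox['until'] ?? 0)) continue;
        unlink("$dir/sandboxed.json");          // sandbox expired, drop it
    }

    $job = json_decode(file_get_contents($file), true);
    $ch  = curl_init($job['url']);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($job['payload']),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => 360,          // generous, as discussed above
        CURLOPT_TIMEOUT        => 360,
    ]);
    $ok = curl_exec($ch) !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) < 400;
    curl_close($ch);

    // Track success/failure per client path; sandbox after repeated failures.
    $stats = is_file("$dir/stats.json")
        ? json_decode(file_get_contents("$dir/stats.json"), true)
        : ['ok' => 0, 'fail' => 0];
    $ok ? $stats['ok']++ : $stats['fail']++;
    file_put_contents("$dir/stats.json", json_encode($stats));

    if ($ok) {
        unlink($file);                           // delivered; no need to keep the cache file
        $sent++;
    } elseif ($stats['fail'] >= MAX_FAILURES) {
        file_put_contents("$dir/sandboxed.json", json_encode(['until' => time() + 86400]));
    }
}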

These systems are only maintained by programmers, and generally ones of a resounding calibre, or they will be websites where you put in your username and API key for the REST API. You also need to make a system for those websites where someone can verify a domain with either a TXT record or a temporary CNAME lookup, or even simple HTML or file verification, so they can register a callback when they are offering third-party services from your API. This does happen: they receive basic user information, such as has-credit (true/false) and disconnected, just a few booleans plus the username and no API key information, and they can use this to query the less sensitive information.
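
A small PHP sketch of that verification idea; the labs-verify token name and the /labs-verify.txt path are hypothetical, with dns_get_record() covering the TXT lookup and a plain file fetch covering file verification:

<?php
// Sketch only: verify that a third party controls a domain before accepting
// their callback URL for the less sensitive lookups described above.
function domainVerified(string $domain, string $expectedToken): bool
{
    // Method 1: a DNS TXT record containing the token.
    foreach (dns_get_record($domain, DNS_TXT) as $record) {
        if (strpos($record['txt'] ?? '', "labs-verify={$expectedToken}") !== false) {
            return true;
        }
    }

    // Method 2: a plain file served from the site root.
    $body = @file_get_contents("http://{$domain}/labs-verify.txt");
    if ($body !== false && trim($body) === $expectedToken) {
        return true;
    }

    return false;
}

// Example: only register the third party's callback once the domain passes verification.
if (domainVerified('example.com', 'a3f9c2d4e1')) {
    // store the callback URL against their account here
}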

This sort of callback on the API is easily manageable. You can have, say, 50 crontab entries calling the PHP script, with variables passed in the cURL header information to specify how many callbacks to process and what they do, all varying from every 1 minute to every 9 minutes and called with different quantities and thresholds. You could also have it start up more crons with a session handler as they are required, just by watching the size of the callback API file cache as these files are written and unlinked; the files do not need to be kept after they are processed, as they are only a snapshot of the moment and you have all of that in the database anyway.
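
As a PHP sketch of that scaling idea, each cron invocation could look at how many cache files are waiting and adjust its batch accordingly; the figures and the batch-size argument are illustrative:

<?php
// Sketch only: decide how much work this cron invocation should take on by
// looking at how many callback cache files are currently waiting.
$pending = count(glob('/var/cache/callbacks/*/*/*/*/*.json') ?: []);

// Base batch size comes from the crontab argument, scaled up when the cache grows.
$batch = (int) ($argv[1] ?? 200);
if ($pending > 10000) {
    $batch *= 4;        // backlog building up, work harder this run
} elseif ($pending > 2000) {
    $batch *= 2;
}

echo "{$pending} queued callbacks, processing up to {$batch} this run\n";
// ... hand $batch to the scoop loop sketched earlier ...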

As long as you report the timezone and the time of the action, it is okay if the callback is delivered some time after the function happened on the network; this is how any honed programmer would assume it works anyway!!!
