Do you know of any conlang software tools?
Posted: Sat Oct 13, 2018 6:31 am
For saving grammar, vocabulary, etc.
Yeah, I used to use LexiquePro as dedicated dictionary software, but now I just do them in Excel. Grammars I do in Word.
Curlyjimsam wrote: ↑Sat Oct 13, 2018 3:30 pm
Yeah, I used to use LexiquePro as dedicated dictionary software, but now I just do them in Excel. Grammars I do in Word.
Why did you move away from LexiquePro?
storyteller232 wrote: ↑Sat Oct 13, 2018 2:19 pm
Conworkshop - https://conworkshop.com/ - web-based, haven't explored it much. I've heard it's hard to figure out, but it has a ton of features once you learn your way around, like the ability to create online courses for your language.
Do you know how to create more than one conlang there?
Haven't gotten around to it yet, but this is the approximate structure I plan to use for most or all dictionaries in the future:
Code:
/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET NAMES utf8 */;
/*!50503 SET NAMES utf8mb4 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
-- Dumping database structure for clgs
CREATE DATABASE IF NOT EXISTS `clgs` /*!40100 DEFAULT CHARACTER SET utf8 COLLATE utf8_bin */;
USE `clgs`;
-- Dumping structure for table clgs.conlang
CREATE TABLE IF NOT EXISTS `conlang` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(50) COLLATE utf8_bin NOT NULL,
`time` int(11) DEFAULT NULL,
`precursor` int(11) DEFAULT NULL,
`setting` int(11) NOT NULL,
PRIMARY KEY (`id`),
KEY `FK_conlang_setting` (`setting`),
KEY `FK_conlang_conlang` (`precursor`),
CONSTRAINT `FK_conlang_conlang` FOREIGN KEY (`precursor`) REFERENCES `conlang` (`id`) ON UPDATE CASCADE,
CONSTRAINT `FK_conlang_setting` FOREIGN KEY (`setting`) REFERENCES `setting` (`id`) ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
-- Dumping data for table clgs.conlang: ~0 rows (approximately)
/*!40000 ALTER TABLE `conlang` DISABLE KEYS */;
/*!40000 ALTER TABLE `conlang` ENABLE KEYS */;
-- Dumping structure for table clgs.entry
CREATE TABLE IF NOT EXISTS `entry` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`conlang` int(11) NOT NULL,
`entry` varchar(150) COLLATE utf8_bin NOT NULL,
`ipa` varchar(150) COLLATE utf8_bin DEFAULT NULL,
`meaning` varchar(150) COLLATE utf8_bin DEFAULT NULL,
`pos` int(11) DEFAULT NULL,
`inflection` int(11) DEFAULT NULL,
`irregularities` json DEFAULT NULL,
`more` json DEFAULT NULL,
`etymology` int(11) DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `FK_entry_conlang` (`conlang`),
KEY `FK_entry_entry` (`etymology`),
CONSTRAINT `FK_entry_conlang` FOREIGN KEY (`conlang`) REFERENCES `conlang` (`id`) ON UPDATE CASCADE,
CONSTRAINT `FK_entry_entry` FOREIGN KEY (`etymology`) REFERENCES `entry` (`id`) ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
-- Dumping data for table clgs.entry: ~0 rows (approximately)
/*!40000 ALTER TABLE `entry` DISABLE KEYS */;
/*!40000 ALTER TABLE `entry` ENABLE KEYS */;
-- Dumping structure for table clgs.setting
CREATE TABLE IF NOT EXISTS `setting` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(50) COLLATE utf8_bin NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
-- Dumping data for table clgs.setting: ~0 rows (approximately)
/*!40000 ALTER TABLE `setting` DISABLE KEYS */;
/*!40000 ALTER TABLE `setting` ENABLE KEYS */;
/*!40101 SET SQL_MODE=IFNULL(@OLD_SQL_MODE, '') */;
/*!40014 SET FOREIGN_KEY_CHECKS=IF(@OLD_FOREIGN_KEY_CHECKS IS NULL, 1, @OLD_FOREIGN_KEY_CHECKS) */;
/*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */;
Zju wrote: ↑Sun Oct 14, 2018 3:52 am
Haven't gotten around to it yet, but this is the approximate structure I plan to use for most or all dictionaries in the future:

I know databases (I'm currently enrolled at a university for Informatics Engineering), and I actually thought of that. But isn't that overkill? First, to add an example (especially a relational one), you have to add it to a separate "example" table. It's practically impossible to read the examples unless you run a query. At that point you're basically creating your own conlanging software just to improve on that.
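For concreteness, such a separate table might look roughly like this (just a sketch; no example table appears in the dump above, and the table and column names are made up):
Code:
-- Hypothetical usage-example table, following the conventions of the dump above
CREATE TABLE IF NOT EXISTS `example` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`entry` int(11) NOT NULL,
`text` varchar(300) COLLATE utf8_bin NOT NULL,
`translation` varchar(300) COLLATE utf8_bin DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `FK_example_entry` (`entry`),
CONSTRAINT `FK_example_entry` FOREIGN KEY (`entry`) REFERENCES `entry` (`id`) ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;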
Once you set up the DB properly, viewing any data and any combination thereof is as easy as writing a query.
Over a certain lexicon size I'd prefer to type in select * from examples where word like 'lorem'; and have the DB do all the work for me, rather than searching manually for all examples of that word's usage. And I can save the more frequently used queries as stored procedures.
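For instance, with an example table like the one sketched above (the table, column, and procedure names here are assumptions, not part of the dump):
Code:
-- All recorded examples for a given headword
SELECT en.entry AS headword, ex.text, ex.translation
FROM example AS ex
JOIN entry AS en ON ex.entry = en.id
WHERE en.entry LIKE 'lorem';

-- The same lookup saved as a stored procedure
DELIMITER //
CREATE PROCEDURE find_examples(IN p_word VARCHAR(150))
BEGIN
SELECT en.entry AS headword, ex.text, ex.translation
FROM example AS ex
JOIN entry AS en ON ex.entry = en.id
WHERE en.entry LIKE p_word;
END //
DELIMITER ;

CALL find_examples('lorem');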
Maybe that's the goal.
Then you shouldn't bother creating a database for that: a database is much harder to distribute. Instead, use XML and query it with XQuery.
Akangka wrote: ↑Sun Oct 14, 2018 2:20 am
Curlyjimsam wrote: ↑Sat Oct 13, 2018 3:30 pm
Yeah, I used to use LexiquePro as dedicated dictionary software, but now I just do them in Excel. Grammars I do in Word.
Why did you move away from LexiquePro?

LexiquePro was pretty good, but in the end I just found Excel more versatile.
Also, do you know of an application for creating a conmap?
I don't see how XML is much better. SQLite and Microsoft Access use a single file you can easily distribute. With MySQL, you can share the SQL dump.
I.e., sorting by source language: native vocabulary versus loanwords from Language X, from Language Y, etc., and also versus neologisms. I color-code my vocabulary, but if you have a lot of donor languages that becomes tedious...
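With the schema above, that breakdown could be a single query instead of manual color coding. Something like this, say (a sketch only: it assumes loanwords point at their source-language entry through the etymology column, and the conlang id is made up):
Code:
-- Entries grouped by the language of their etymological source
-- (a NULL etymology is treated here as native vocabulary or a neologism)
SELECT COALESCE(src_lang.name, 'native/neologism') AS source_language,
e.entry, e.meaning
FROM entry AS e
LEFT JOIN entry AS src ON e.etymology = src.id
LEFT JOIN conlang AS src_lang ON src.conlang = src_lang.id
WHERE e.conlang = 1 -- hypothetical id of the conlang being sorted
ORDER BY source_language, e.entry;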