
Commit

Fix spelling
Includes some minor grammatical fixes as well.

Notable change:

`JOURNAL_SAFEY` -> `JOURNAL_SAFE`

Signed-off-by: Josh Soref <[email protected]>
jsoref authored and rschlussel committed Mar 7, 2022
1 parent 7c1b764 commit 5c6efc7
Showing 237 changed files with 548 additions and 548 deletions.
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -13,7 +13,7 @@ Presto welcomes contributions from everyone.
Contributions to Presto should be made in the form of GitHub pull request submissions and reviews.

Each pull request submission will be reviewed by a contributor or [committer](https://github.com/prestodb/presto/wiki/committers)
in the project. Only committers may merge a pull request. Large contributions should have an associated Github issue.
in the project. Only committers may merge a pull request. Large contributions should have an associated GitHub issue.

Pull request reviews are encouraged for anyone in the community who would like to contribute to Presto, and are
expected from contributors and committers in at least equal proportion to their code contributions.
4 changes: 2 additions & 2 deletions README.md
@@ -95,11 +95,11 @@ To learn how to build the docs, see the [docs README](presto-docs/README.md).

## Building the Web UI

The Presto Web UI is composed of several React components and is written in JSX and ES6. This source code is compiled and packaged into browser-compatible Javascript, which is then checked in to the Presto source code (in the `dist` folder). You must have [Node.js](https://nodejs.org/en/download/) and [Yarn](https://yarnpkg.com/en/) installed to execute these commands. To update this folder after making changes, simply run:
The Presto Web UI is composed of several React components and is written in JSX and ES6. This source code is compiled and packaged into browser-compatible JavaScript, which is then checked in to the Presto source code (in the `dist` folder). You must have [Node.js](https://nodejs.org/en/download/) and [Yarn](https://yarnpkg.com/en/) installed to execute these commands. To update this folder after making changes, simply run:

yarn --cwd presto-main/src/main/resources/webapp/src install

If no Javascript dependencies have changed (i.e., no changes to `package.json`), it is faster to run:
If no JavaScript dependencies have changed (i.e., no changes to `package.json`), it is faster to run:

yarn --cwd presto-main/src/main/resources/webapp/src run package

2 changes: 1 addition & 1 deletion mvnw
@@ -130,7 +130,7 @@ if $cygwin ; then
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi

# For Migwn, ensure paths are in UNIX format before anything is touched
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
@@ -204,7 +204,7 @@ private static void validateColumns(ConnectorTableMetadata meta)
if (columnMapping.get().values().stream()
.filter(pair -> pair.getKey().equals(reservedRowIdColumn) && pair.getValue().equals(reservedRowIdColumn))
.count() > 0) {
throw new PrestoException(INVALID_TABLE_PROPERTY, format("Column familiy/qualifier mapping of %s:%s is reserved", reservedRowIdColumn, reservedRowIdColumn));
throw new PrestoException(INVALID_TABLE_PROPERTY, format("Column family/qualifier mapping of %s:%s is reserved", reservedRowIdColumn, reservedRowIdColumn));
}
}
else if (AccumuloTableProperties.isExternal(meta.getProperties())) {
@@ -56,7 +56,7 @@
/**
* Presto module to do all kinds of run Guice injection stuff!
* <p>
* WARNING: Contains black magick
* WARNING: Contains black magic
*/
public class AccumuloModule
implements Module
@@ -371,7 +371,7 @@ private List<Range> getIndexRanges(String indexTable, Multimap<AccumuloColumnCon

private static void binRanges(int numRangesPerBin, List<Range> splitRanges, List<TabletSplitMetadata> prestoSplits)
{
checkArgument(numRangesPerBin > 0, "number of ranges per bin must positivebe greater than zero");
checkArgument(numRangesPerBin > 0, "number of ranges per bin must be greater than zero");
int toAdd = splitRanges.size();
int fromIndex = 0;
int toIndex = Math.min(toAdd, numRangesPerBin);
@@ -281,7 +281,7 @@ public void close()
* so we can configure Accumulo to only give us the first key/value pair in the row
*
* @param rowIdName Row ID column name
* @return True if scanner should retriev eonly row IDs, false otherwise
* @return True if scanner should retrieve only row IDs, false otherwise
*/
private boolean retrieveOnlyRowIds(String rowIdName)
{
@@ -82,7 +82,7 @@ public AccumuloRecordSet(
throw new PrestoException(NOT_FOUND, "Failed to factory serializer class. Is it on the classpath?", e);
}

// Save off the column handles and createa list of the Accumulo types
// Save off the column handles and create a list of the Accumulo types
this.columnHandles = requireNonNull(columnHandles, "column handles is null");
ImmutableList.Builder<Type> types = ImmutableList.builder();
for (AccumuloColumnHandle column : columnHandles) {
@@ -27,7 +27,7 @@ public SqlJoniRegexpBenchmark(LocalQueryRunner localQueryRunner, String query, S

public static void main(String[] args)
{
new SqlJoniRegexpBenchmark(createLocalQueryRunner(), "SELECT array_agg(regexp_extract_all(comment||cast(random() as varchar), '[a-z]* ')) FROM orders cross join unnest(sequence(1, 10))", "sql_regexp_extract_alll").runBenchmark(new SimpleLineBenchmarkResultWriter(System.out));
new SqlJoniRegexpBenchmark(createLocalQueryRunner(), "SELECT array_agg(regexp_extract_all(comment||cast(random() as varchar), '[a-z]* ')) FROM orders cross join unnest(sequence(1, 10))", "sql_regexp_extract_all").runBenchmark(new SimpleLineBenchmarkResultWriter(System.out));
new SqlJoniRegexpBenchmark(createLocalQueryRunner(), "SELECT array_agg(regexp_replace(comment||cast(random() as varchar), '[a-z]* ', cast(random() as varchar))) FROM orders cross join unnest(sequence(1, 10))", "sql_regexp_replace").runBenchmark(new SimpleLineBenchmarkResultWriter(System.out));
}
}
@@ -15,7 +15,7 @@ FROM
AND ("ss_store_sk" = "s_store_sk")
AND ("d_month_seq" IN (1200 , (1200 + 1) , (1200 + 2) , (1200 + 3) , (1200 + 4) , (1200 + 5) , (1200 + 6) , (1200 + 7) , (1200 + 8) , (1200 + 9) , (1200 + 10) , (1200 + 11)))
AND ((("i_category" IN ('Books' , 'Children' , 'Electronics'))
AND ("i_class" IN ('personal' , 'portable' , 'refernece' , 'self-help'))
AND ("i_class" IN ('personal' , 'portable' , 'reference' , 'self-help'))
AND ("i_brand" IN ('scholaramalgamalg #14' , 'scholaramalgamalg #7' , 'exportiunivamalg #9' , 'scholaramalgamalg #9')))
OR (("i_category" IN ('Women' , 'Music' , 'Men'))
AND ("i_class" IN ('accessories' , 'classical' , 'fragrances' , 'pants'))
@@ -66,7 +66,7 @@ private static BytecodeBlock generateBytecode(ParameterizedType sourceType, Para
case PRIMITIVE:
castPrimitiveToPrimitive(block, sourceType.getPrimitiveType(), targetType.getPrimitiveType());
return block;
case BOXED_PRIMITVE:
case BOXED_PRIMITIVE:
checkArgument(sourceType.getPrimitiveType() == unwrapPrimitiveType(targetType), "Type %s can not be cast to %s", sourceType, targetType);
return block.invokeStatic(targetType, "valueOf", targetType, sourceType);
case OTHER:
@@ -76,12 +76,12 @@ private static BytecodeBlock generateBytecode(ParameterizedType sourceType, Para
.invokeStatic(wrap(sourceClass), "valueOf", wrap(sourceClass), sourceClass)
.checkCast(targetType);
}
case BOXED_PRIMITVE:
case BOXED_PRIMITIVE:
switch (getTypeKind(targetType)) {
case PRIMITIVE:
checkArgument(unwrapPrimitiveType(sourceType) == targetType.getPrimitiveType(), "Type %s can not be cast to %s", sourceType, targetType);
return block.invokeVirtual(sourceType, targetType.getPrimitiveType().getSimpleName() + "Value", targetType);
case BOXED_PRIMITVE:
case BOXED_PRIMITIVE:
checkArgument(sourceType.equals(targetType), "Type %s can not be cast to %s", sourceType, targetType);
return block;
case OTHER:
Expand All @@ -94,7 +94,7 @@ private static BytecodeBlock generateBytecode(ParameterizedType sourceType, Para
return block
.checkCast(wrap(targetType.getPrimitiveType()))
.invokeVirtual(wrap(targetType.getPrimitiveType()), targetType.getPrimitiveType().getSimpleName() + "Value", targetType.getPrimitiveType());
case BOXED_PRIMITVE:
case BOXED_PRIMITIVE:
case OTHER:
return block.checkCast(targetType);
}
@@ -282,7 +282,7 @@ private static TypeKind getTypeKind(ParameterizedType type)
return TypeKind.PRIMITIVE;
}
if (unwrapPrimitiveType(type) != null) {
return TypeKind.BOXED_PRIMITVE;
return TypeKind.BOXED_PRIMITIVE;
}
return TypeKind.OTHER;
}
@@ -325,6 +325,6 @@ public List<BytecodeNode> getChildNodes()

private enum TypeKind
{
PRIMITIVE, BOXED_PRIMITVE, OTHER
PRIMITIVE, BOXED_PRIMITIVE, OTHER
}
}
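
For readers less familiar with the JVM boxing terminology in this hunk: the three `TypeKind` cases map onto ordinary Java conversions, with the hunk above emitting `invokeStatic valueOf`, `invokeVirtual ...Value`, and `checkcast` respectively. A rough plain-Java analogue (an illustrative sketch, not output of this class):

```java
public class CastKindsSketch
{
    public static void main(String[] args)
    {
        int primitive = 42;

        // PRIMITIVE -> BOXED_PRIMITIVE: the generated bytecode calls Integer.valueOf
        Integer boxed = Integer.valueOf(primitive);

        // BOXED_PRIMITIVE -> PRIMITIVE: the generated bytecode calls intValue()
        int unboxed = boxed.intValue();

        // PRIMITIVE -> OTHER: box first, then a checked reference cast (checkcast)
        Object asObject = Integer.valueOf(primitive);

        // OTHER -> BOXED_PRIMITIVE / OTHER: a plain checkcast on the reference
        Integer roundTripped = (Integer) asObject;

        System.out.println(unboxed + " " + roundTripped); // 42 42
    }
}
```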
@@ -51,7 +51,7 @@ public void testCastBetweenObjectAndPrimitive()
}

@Test
public void testInvalildCast()
public void testInvalidCast()
{
// Cast between a boxed primitive and a primitive that are different
assertInvalidCast(getStatic(getClass(), "INT_FIELD"), Double.class);
@@ -407,7 +407,7 @@ private boolean write(FileReadRequest key, byte[] data, Path newFilePath)

// no lock is needed for the following operation
if (updated) {
// remove the the previous or following file as well
// remove the previous or following file as well
if (previousCacheFile != null) {
cacheFilesToDelete.add(previousCacheFile.getPath());
}
8 changes: 4 additions & 4 deletions presto-cassandra/src/test/resources/cu-cassandra.yaml
@@ -77,7 +77,7 @@ permissions_validity_in_ms: 2000
#
# - RandomPartitioner distributes rows across the cluster evenly by md5.
# This is the default prior to 1.2 and is retained for compatibility.
# - Murmur3Partitioner is similar to RandomPartioner but uses Murmur3_128
# - Murmur3Partitioner is similar to RandomPartitioner but uses Murmur3_128
# Hash Function instead of md5. When in doubt, this is the best option.
# - ByteOrderedPartitioner orders rows lexically by key bytes. BOP allows
# scanning rows in key order, but the ordering can generate hot spots
@@ -525,10 +525,10 @@ request_scheduler: org.apache.cassandra.scheduler.NoScheduler
# the request scheduling. Currently the only valid option is keyspace.
# request_scheduler_id: keyspace

# index_interval controls the sampling of entries from the primrary
# index_interval controls the sampling of entries from the primary
# row index in terms of space versus time. The larger the interval,
# the smaller and less effective the sampling will be. In technicial
# terms, the interval coresponds to the number of index entries that
# the smaller and less effective the sampling will be. In technical
# terms, the interval corresponds to the number of index entries that
# are skipped between taking each sample. All the sampled entries
# must fit in memory. Generally, a value between 128 and 512 here
# coupled with a large key cache size on CFs results in the best trade
4 changes: 2 additions & 2 deletions presto-cli/src/main/java/com/facebook/presto/cli/Pager.java
@@ -163,10 +163,10 @@ public static Pager create(List<String> command)

private static Pager createNullPager()
{
return new Pager(uncloseableOutputStream(System.out), null);
return new Pager(unclosableOutputStream(System.out), null);
}

private static OutputStream uncloseableOutputStream(OutputStream out)
private static OutputStream unclosableOutputStream(OutputStream out)
{
return new FilterOutputStream(out)
{
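
The anonymous `FilterOutputStream` above is truncated here; it presumably exists so that closing the null pager never closes the process's standard output. A minimal, standalone sketch of that wrapper pattern, assuming `close()` is overridden to flush only:

```java
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public final class UnclosableStreams
{
    // Wrap a stream so that close() flushes but never closes the underlying stream.
    // Useful when handing System.out to code that insists on closing what it is given.
    public static OutputStream unclosable(OutputStream out)
    {
        return new FilterOutputStream(out)
        {
            @Override
            public void close()
                    throws IOException
            {
                flush(); // keep the delegate open; only push out buffered bytes
            }
        };
    }

    public static void main(String[] args)
            throws IOException
    {
        OutputStream wrapped = unclosable(System.out);
        wrapped.write("still usable after close\n".getBytes());
        wrapped.close();                            // flushes, does not close System.out
        System.out.println("System.out survives");
    }
}
```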
@@ -148,7 +148,7 @@ public static class ClientTypeSignatureParameterDeserializer
private static final ObjectMapper MAPPER = new JsonObjectMapperProvider().get();

@Override
public ClientTypeSignatureParameter deserialize(JsonParser jp, DeserializationContext ctxt)
public ClientTypeSignatureParameter deserialize(JsonParser jp, DeserializationContext ctx)
throws IOException
{
JsonNode node = jp.getCodec().readTree(jp);
@@ -113,7 +113,7 @@ public static void quickSort(final int[][] x, final long from, final long to, fi
if (len > SMALL) {
long l = from;
long n = to - 1;
if (len > MEDIUM) { // Big arrays, pseudomedian of 9
if (len > MEDIUM) { // Big arrays, pseudo-median of 9
long s = len / 8;
l = med3(x, l, l + s, l + 2 * s, comp);
m = med3(x, m - s, m, m + s, comp);
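
The pseudo-median of 9 named in the corrected comment is the classic Bentley–McIlroy pivot-selection trick: sample three triples across the range, take the median of each, then the median of those medians. A compact illustration on a plain `int[]` (the code above does the same thing for `int[][]` with a comparator):

```java
import java.util.Arrays;

public class PseudoMedianSketch
{
    // index of the median of the values at indices a, b, c
    static int med3(int[] x, int a, int b, int c)
    {
        return x[a] < x[b]
                ? (x[b] < x[c] ? b : x[a] < x[c] ? c : a)
                : (x[b] > x[c] ? b : x[a] > x[c] ? c : a);
    }

    // pseudo-median of 9 over [from, to): three medians of three, then their median
    static int pseudoMedianOf9(int[] x, int from, int to)
    {
        int len = to - from;
        int l = from;
        int m = from + len / 2;
        int n = to - 1;
        int s = len / 8;
        l = med3(x, l, l + s, l + 2 * s);
        m = med3(x, m - s, m, m + s);
        n = med3(x, n - 2 * s, n - s, n);
        return med3(x, l, m, n);        // index of a value close to the true median
    }

    public static void main(String[] args)
    {
        int[] data = new int[100];
        for (int i = 0; i < data.length; i++) {
            data[i] = (i * 37) % 101;   // scrambled but deterministic values
        }
        int pivotIndex = pseudoMedianOf9(data, 0, data.length);
        System.out.println("pivot value " + data[pivotIndex]);
        System.out.println("true median " + Arrays.stream(data).sorted().toArray()[data.length / 2]);
    }
}
```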
@@ -62,12 +62,12 @@ public static TupleDomainFilter toFilter(Domain domain)
boolean nullAllowed = domain.isNullAllowed();

if (values.isAll()) {
checkArgument(!nullAllowed, "Unexpected allways-true filter");
checkArgument(!nullAllowed, "Unexpected always-true filter");
return IS_NOT_NULL;
}

if (values.isNone()) {
checkArgument(nullAllowed, "Unexpected allways-false filter");
checkArgument(nullAllowed, "Unexpected always-false filter");
return IS_NULL;
}

@@ -359,7 +359,7 @@ LongEnumMap getLongEnumMap()
VarcharEnumMap getVarcharEnumMap()
{
checkArgument(!isBigintEnum, "Invalid enum map format");
// Varchar enum values are base32-encoded so that they are case-insensitive, which is expected of TypeSigntures
// Varchar enum values are base32-encoded so that they are case-insensitive, which is expected of TypeSignatures
Base32 base32 = new Base32();
return new VarcharEnumMap(
typeName,
@@ -1434,7 +1434,7 @@ static int[] shiftLeftMultiPrecision(int[] number, int length, int shifts)
}
// wordShifts = shifts / 32
int wordShifts = shifts >>> 5;
// we don't wan't to loose any leading bits
// we don't want to lose any leading bits
for (int i = 0; i < wordShifts; i++) {
checkState(number[length - i - 1] == 0);
}
@@ -1445,7 +1445,7 @@ static int[] shiftLeftMultiPrecision(int[] number, int length, int shifts)
// bitShifts = shifts % 32
int bitShifts = shifts & 0b11111;
if (bitShifts > 0) {
// we don't wan't to loose any leading bits
// we don't want to lose any leading bits
checkState(number[length - 1] >>> (Integer.SIZE - bitShifts) == 0);
for (int position = length - 1; position > 0; position--) {
number[position] = (number[position] << bitShifts) | (number[position - 1] >>> (Integer.SIZE - bitShifts));
@@ -1463,7 +1463,7 @@ static int[] shiftRightMultiPrecision(int[] number, int length, int shifts)
}
// wordShifts = shifts / 32
int wordShifts = shifts >>> 5;
// we don't wan't to loose any trailing bits
// we don't want to lose any trailing bits
for (int i = 0; i < wordShifts; i++) {
checkState(number[i] == 0);
}
@@ -1474,7 +1474,7 @@ static int[] shiftRightMultiPrecision(int[] number, int length, int shifts)
// bitShifts = shifts % 32
int bitShifts = shifts & 0b11111;
if (bitShifts > 0) {
// we don't wan't to loose any trailing bits
// we don't want to lose any trailing bits
checkState(number[0] << (Integer.SIZE - bitShifts) == 0);
for (int position = 0; position < length - 1; position++) {
number[position] = (number[position] >>> bitShifts) | (number[position + 1] << (Integer.SIZE - bitShifts));
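
For context, `shiftLeftMultiPrecision`/`shiftRightMultiPrecision` split an arbitrary shift into whole 32-bit word moves plus a residual bit shift, exactly as the `wordShifts`/`bitShifts` comments above describe. A self-contained sketch of the left-shift half on a little-endian `int[]` (an illustration under those assumptions, not the Presto implementation, which additionally `checkState`s that no set bits fall off the end):

```java
import java.util.Arrays;

public class MultiPrecisionShiftSketch
{
    // Shift a little-endian multi-word number left by `shifts` bits, in place.
    // The caller must leave enough zeroed high words to absorb the shift.
    static int[] shiftLeft(int[] number, int shifts)
    {
        int length = number.length;
        int wordShifts = shifts >>> 5;      // shifts / 32: whole words to move
        int bitShifts = shifts & 0b11111;   // shifts % 32: bits within a word

        if (wordShifts > 0) {
            System.arraycopy(number, 0, number, wordShifts, length - wordShifts);
            Arrays.fill(number, 0, wordShifts, 0);
        }
        if (bitShifts > 0) {
            // carry the top bits of each word into the word above it, highest first
            for (int position = length - 1; position > 0; position--) {
                number[position] = (number[position] << bitShifts)
                        | (number[position - 1] >>> (Integer.SIZE - bitShifts));
            }
            number[0] <<= bitShifts;
        }
        return number;
    }

    public static void main(String[] args)
    {
        // 1 << 37 occupies bit 5 of the second word: expect [0, 32, 0]
        System.out.println(Arrays.toString(shiftLeft(new int[] {1, 0, 0}, 37)));
    }
}
```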
@@ -118,7 +118,7 @@ public boolean equals(Object o)
@Override
public String toString()
{
// Varchar enum values are base32-encoded so that they are case-insensitive, which is expected of TypeSigntures
// Varchar enum values are base32-encoded so that they are case-insensitive, which is expected of TypeSignatures
Base32 base32 = new Base32();
return format("%s{%s}", typeName, enumMap.entrySet().stream()
.sorted(Comparator.comparing(Map.Entry::getKey))
@@ -526,7 +526,7 @@ void encode(final byte[] in, int inPos, final int inAvail, final Context context
* Returns whether or not the {@code octet} is in the Base32 alphabet.
*
* @param octet The value to test
* @return {@code true} if the value is defined in the the Base32 alphabet {@code false} otherwise.
* @return {@code true} if the value is defined in the Base32 alphabet {@code false} otherwise.
*/
@Override
public boolean isInAlphabet(final byte octet)
@@ -334,7 +334,7 @@ public byte[] decode(final byte[] pArray)
* Encodes a byte[] containing binary data, into a byte[] containing characters in the alphabet.
*
* @param pArray a byte array containing binary data
* @return A byte array containing only the basen alphabetic character data
* @return A byte array containing only the base-n alphabetic character data
*/
public byte[] encode(final byte[] pArray)
{
@@ -37,7 +37,7 @@
import static org.testng.Assert.assertTrue;

@Test(singleThreaded = true)
public class TestBlockFlattenner
public class TestBlockFlattener
{
private ArrayAllocator allocator;
private BlockFlattener flattener;
2 changes: 1 addition & 1 deletion presto-docs/Makefile
@@ -10,7 +10,7 @@ SOURCEDIR = src/main/sphinx

# User-friendly check for sphinx-build
ifeq ($(shell which $(PYTHON3) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(PYTHON3)' command was not found. Make sure you have Python3 instaled. You can grab it from https://www.python.org/)
$(error The '$(PYTHON3)' command was not found. Make sure you have Python3 installed. You can grab it from https://www.python.org/)
endif

# Internal variables.
2 changes: 1 addition & 1 deletion presto-docs/src/main/sphinx/admin/resource-groups.rst
@@ -76,7 +76,7 @@ Resource Group Properties
* ``totalMemoryLimit`` (optional): Specify an absolute value (i.e. ``1GB``)
for the maximum distributed memory a query may consume.

* ``cpuTimeLimit`` (optional): Specify Specify an absolute value (i.e. ``1h``)
* ``cpuTimeLimit`` (optional): Specify an absolute value (i.e. ``1h``)
for the maximum CPU time a query may use.

* ``subGroups`` (optional): list of sub-groups.
4 changes: 2 additions & 2 deletions presto-docs/src/main/sphinx/connector/accumulo.rst
@@ -181,7 +181,7 @@ You can then issue ``INSERT`` statements to put data into Accumulo.
(2 rows)
As you'd expect, rows inserted into Accumulo via the shell or
programatically will also show up when queried. (The Accumulo shell
programmatically will also show up when queried. (The Accumulo shell
thinks "-5321" is an option and not a number... so we'll just make TBL a
little younger.)

@@ -514,7 +514,7 @@ Serializers

The Presto connector for Accumulo has a pluggable serializer framework
for handling I/O between Presto and Accumulo. This enables end-users the
ability to programatically serialized and deserialize their special data
ability to programmatically serialized and deserialize their special data
formats within Accumulo, while abstracting away the complexity of the
connector itself.

